Saturday, February 21, 2009

NY Times TimesOpen Recap: TimesNewswire API Coming Next Week

Today was the first TimesOpen day at the NY Times headquarters in NYC. I was able to attend the morning sessions and I'd like to share my notes and a couple of photos. I think the event was very professionally run and the room was completely packed. It's interesting that they didn't hold the event in what appears to be a beautiful theatre next door to the HQ building. Check out my notes and slides from the Tim O'Reilly keynote as well.

In the audience were people from Google, Yahoo and a good number of other large tech companies. I also saw a variety of bloggers in the crowd, but it seemed like there were more people from large companies than, say, indie developers in attendance.

The big news coming out of the morning sessions was that there is a new API launching next week called TimesNewswire. This will give developers access to live headlines. Attendee Kellan called this new API a New York Times firehose and noted, "NewsWire API is the paper's stream of consciousness."
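
Since the API wasn't live yet at the time of the event, any code is speculative, but the shape is easy to imagine. Here's a minimal sketch in TypeScript of polling a newswire-style REST endpoint for the latest headlines; the URL, query path, and response fields are my assumptions modeled on the Times' other developer APIs, not the released interface:

```typescript
// Hypothetical TimesNewswire client. The endpoint shape and field
// names below are assumptions -- the API had not launched at the
// time of writing.
const API_KEY = "your-key-here"; // assumed developer-key signup, per the Times' other APIs

interface NewswireItem {
  title: string;
  url: string;
  published_date: string;
}

async function latestHeadlines(): Promise<NewswireItem[]> {
  const endpoint =
    `https://api.nytimes.com/svc/news/v3/content/all/all.json?api-key=${API_KEY}`;
  const res = await fetch(endpoint);
  if (!res.ok) throw new Error(`Newswire request failed: ${res.status}`);
  const body = await res.json();
  return body.results as NewswireItem[];
}

// Print the five most recent headlines from the firehose.
latestHeadlines().then((items) =>
  items.slice(0, 5).forEach((i) => console.log(i.published_date, i.title))
);
```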

President and Chief Executive Officer Janet Robinson welcomed everyone to the event, saying that everyone in attendance is part of the paper's history but also a very important part of the Times' future.

The concept of the day was to bring technology and the future of the newspaper together.

The other executive who spoke noted that they have an intense desire to make sure the content is personalized going forward.

They gave me a black t-shirt (size L) with the TimesOpen logo on it. If you would like the t-shirt, just leave a comment and I will pick one comment at random.

O'Reilly Future of Newspapers (slides)

Wait a Moment... Who is the Desperate One?

Last night I read a hit-job post like I haven't seen since my days on the G in Brooklyn during the 70s. The hit-job I am referring to is Erick Schonfeld's piece about Matt Marshall joining the DEMO conference team. Apparently because one person (Chris Shipley) decided to change her focus after running the conference for 13 years, they must be "desperate". Erick also makes the following observations: "He is going to have to reinvigorate a dying brand." and "It is fine by us if DEMO sticks to its model of extorting startups". I can't believe we are still talking about this TC50 vs. DEMO crap.

Erick uses the post to explain that his conference, TechCrunch50, is the much better model and that he will now have to "crush" VentureBeat. He notes that TechCrunch50 companies get in on merit (oh, is that how they do it!). Erick also includes a chart showing how little traffic demo.com receives but leaves the techcrunch50.com site out of it - the comparison chart is available below for reference.

I wrote an in-person review of both DEMO and TC50 last year, as I was one of three people to attend both events. It was great to meet so many CN readers at both events. After the unprofessional treatment I received at TC50, I won't be attending this year. I am not going to go into the behavior here, but suffice it to say that even the event staffers thought it was unprofessional.

The truth is that the numerous stories I heard from entrepreneurs in the DEMO pit - the startups that pay $3k/day - were not good. But, as one might guess, no one wants to speak on the record because they are afraid.

It's totally understandable that the TechCrunch team is probably a little upset that Time magazine ranked their top competitor Mashable as a top blog for 2009 while TechCrunch was listed as "overrated". Might these be some additional reasons why TechCrunch is the desperate one in this conversation?

  • They've added a "javascript page refresh" - this means that if you leave the site open in a tab, it will reload every so often, generating extra pageviews for the site (a sketch of the pattern follows this list)
  • They continue to increase the page views required to read the comments - first it was 100 per page with a link to "view all comments"; now that link is gone and fewer comments are viewable on each page
  • They have decided to break embargoes when it suits them to make sure they appear first
  • Even with their reportedly strong-arm tactics, startups are starting to provide news to everyone but TechCrunch. As I've said since the early days, the best route is to provide the news to all of the sites you trust to honor the posting time and get the most coverage and feedback you can.
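
On that first point, an auto-refresh like this is only a few lines of client-side script. This is my reconstruction of the pattern, not TechCrunch's actual code, and the five-minute interval is an assumption:

```typescript
// Illustrative reconstruction of a pageview-padding auto-refresh,
// not TechCrunch's actual code. An idle tab reloads on a timer,
// registering a fresh pageview on every cycle.
const REFRESH_INTERVAL_MS = 5 * 60 * 1000; // assumed: every five minutes

window.setInterval(() => {
  window.location.reload();
}, REFRESH_INTERVAL_MS);
```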

Patricio Robles from eConsultancy has a good post responding to the same piece. He talks about why positivity sells; I guess he doesn't read the aforementioned blog often. Let's hope that Erick thinks about all the negative feedback on his post and offers Matt and Chris an apology, even if it's handled privately.

Facebook to Users: We Are Sorry, Please Don’t Delete Your Account

It looks like a lot of Facebook users have deleted (or are in the process of deleting) their Facebook profiles due to the recent uproar over Facebook's new terms of service.

Facebook has since reverted to the old terms of service, but that alone may not have stopped the exodus, so they now also show a "we are sorry" message to anyone who tries to delete his or her Facebook account.

Screenshot: the Facebook account deletion page, with the new apology message highlighted

The highlighted text reads:

Are you deleting because you are concerned about Facebook’s Terms of Service? This was a mistake that we have now corrected. You own the information you put on Facebook and you control what happens to it. We are sorry for the confusion.

You can see this page live here, but please don't click the submit button or it will permanently delete your Facebook account - there's no way to reactivate a closed account. Thanks, Jeremiah.

Single Google Query uses 1000 Machines in 0.2 seconds

Google is normally quite secretive about their search infrastructure but, in a break from tradition, they have revealed that a single search query on Google can consume the processing power of 1000 machines.

Google Fellow Jeff Dean, in a keynote talk at WSDM 2009, shared some numbers about Google’s impressive growth run from 1999 to 2009. According to Dean, while both search queries and processing power have gone up by a factor of 1000, latency has gone down from around 1000ms to 200ms. Crawler updates now take minutes compared to months in 1999.

Another significant change was the switch to holding the complete search index in memory, resulting in the use of 1000 machines to handle a single query compared to just 12 previously.
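
Dean didn't walk through implementation details, but serving one query from 1000 machines is the classic scatter-gather pattern: the in-memory index is split into shards, each machine searches its own slice in parallel, and a root server merges the per-shard results. Here's a minimal sketch in TypeScript, with the shard addresses, result shape, and scoring invented for illustration - this is the pattern, not Google's implementation:

```typescript
// Minimal scatter-gather sketch of "one query, many index shards".
// Shard URLs and the Hit shape are invented for illustration.
interface Hit {
  docId: string;
  score: number;
}

// Each shard holds one slice of the in-memory index.
const SHARDS: string[] = Array.from(
  { length: 1000 },
  (_, i) => `http://index-shard-${i}.internal/search`
);

async function queryShard(shardUrl: string, q: string): Promise<Hit[]> {
  const res = await fetch(`${shardUrl}?q=${encodeURIComponent(q)}`);
  return (await res.json()) as Hit[];
}

async function search(q: string, topK = 10): Promise<Hit[]> {
  // Scatter: fan the query out to every shard in parallel...
  const perShard = await Promise.all(SHARDS.map((s) => queryShard(s, q)));
  // ...then gather: merge all shard results and keep the best topK.
  return perShard
    .flat()
    .sort((a, b) => b.score - a.score)
    .slice(0, topK);
}
```

The payoff is latency: with the index in RAM and the work fanned out, each machine only has to scan a tiny fraction of the data, which fits the 1000ms-to-200ms drop Dean described.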

This revelation may be a bit embarrassing for Google, which has defended its ecological record in the past, claiming that a single Google query uses just 0.0003 kWh of energy and that the Google datacenters are "the world's most efficient."
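
For scale (my back-of-the-envelope arithmetic, not a Google figure): 0.0003 kWh is 1.08 kJ, and if that energy were spread across 1000 machines over the 0.2 seconds of a query, it would come to roughly 5.4 watts per machine for the duration - so the two claims aren't necessarily in conflict.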