DataMarket blog

Data, visualization and startup life

Season’s Greetings from DataMarket

leave a comment »

To all our users, friends and followers:


Thanks for a wonderful year of data and joy,
The DataMarket Nerds

Written by Hjalmar Gislason

December 23, 2012 at 2:02 am

Posted in Uncategorized

Data Visualizations and Storytelling

with 3 comments

Slides from a presentation I gave at the Data Scientist Seminar Series in Boston on December 3, 2012.

Note that most of the images are links to relevant data, technologies or additional information, so click around!

Written by Hjalmar Gislason

December 4, 2012 at 12:15 am

Posted in Uncategorized

Presentation at Strata NY, October 2012

leave a comment »

Below are the slides from my presentation “Best Practices for Publishing Data” given at the Strata Conference in New York, October 2012.

The slide deck includes a lot of links to additional resources, so go ahead and click around.

Note that this is an expanded and improved version of the slide deck from a presentation with the same title given at Strata London.

Written by Hjalmar Gislason

October 25, 2012 at 6:40 pm

Posted in Uncategorized

DataMarket – Energy: New Business enabled by Open Data

with 2 comments

Last week DataMarket introduced a new product, an energy-specific data service called, simply, DataMarket – Energy.

The venue at which we introduced the service was quite unusual. We were lucky enough to be invited – along with a selected group of other startups and innovators working with energy data – to present our work at the White House at an event called Energy Datapalooza. We have since jokingly said that in order to top this venue for our next product announcement, we will have to book the International Space Station. I’m working on that.

Here’s a short video of my presentation there and the unveiling of our new service:

Those of you who have been following DataMarket for a while will notice that the business model for this new product is significantly different from what we have previously been running with.

When we originally kicked DataMarket.com off with international data in early 2011, there was only one thing users could pay us for: a low-priced premium subscription that gave access to additional features, such as more advanced data export formats, automated reports and a few other things. A couple of months later we added the first premium data to the site; data from premium data providers such as the Economist Intelligence Unit (links to EIU data on DataMarket), resold through our site.

However, using the site’s core functionality – the ability to search, visualize, compare, and download data from the vast collections of Open Data that we aggregate – has always been free. As such, DataMarket.com has become quite popular in certain circles. But quite frankly, the two revenue sources have not taken off in a big way.

What has taken off, however, is our technology licensing business. We’ve seen high demand for our data delivery technology from other information companies. The ability to normalize data from a wide variety of data sources, and to enable users to access that data through powerful search and online visualization tools, is something many information companies, such as market research and financial data companies, have identified a strong need for. So last February we formally introduced our data publishing tools, most prominently what we now call the Data Delivery Engine, a white-label solution that is already up and running for a few well-known information companies (including Yankee Group and Lux Research), with several others in the implementation stages. This licensing business is where most of our revenue comes from today, so one could really say that we’re now more of a software company than a data company.

The upcoming launch of DataMarket – Energy is another stab at the data side of the equation, but the approach is different in several ways:

  • Focus and scope: By focusing on a single industry or vertical we can make the service much more relevant to its users. Instead of solving 10-15% of everybody’s data needs with the kind of macro-economic and demographic data that can be accessed on DataMarket.com, we aim to address 90-100% of the data needs of a much more targeted audience.
  • Premium access: We’re selling access to this service at a substantial premium (final pricing is still being decided). Those who see value in the discovery and aggregation services that we add on top of the data will be charged for the “job they hire our product to do”. This indeed means that some data that has been made publicly available for free (Open Data) will only be available to DataMarket users behind a paywall. As explained in the presentation above, that doesn’t take the least bit away from the value of the Open Data. On the contrary: the data is still available in its original form from the publishing organizations, but we add a choice on top of that: a nicer, more user-friendly way to access the data for those who are willing to pay for that value-add.
  • Targeted sales: Instead of relying as much on PR and viral distribution as we have with DataMarket.com, we’ll use more direct, traditional sales approaches for this new service.

One of the interesting things about running a technology startup is that the same technology can be turned into so many different products without a single line of additional code. Often the only difference is how you promote it, price it and sell it. This can be both a curse and a blessing, and usually a few things need to be thrown at the wall before you find what sticks. Luck is involved too, but as the famous Swedish alpine skier Ingemar Stenmark is quoted as saying: “The more I practice, the luckier I get”.

It will be interesting to see if we’ve practiced our data marketing skills enough for the DataMarket – Energy approach to work out.

Written by Hjalmar Gislason

October 12, 2012 at 8:16 pm

Posted in Uncategorized

Best Practices for Publishing Data

with one comment

Slides from a presentation given by Hjalmar Gislason, founder and CEO of DataMarket, at the Strata Conference in London, October 2012.

Written by Hjalmar Gislason

October 2, 2012 at 3:08 pm

Posted in Uncategorized

Worse than a 3D pie chart

with 9 comments

I have seen my share of good charts and I have seen my share of bad charts, but I never expected what I saw today.

As you may know, Hjalli and I are writing a book about chart design. We will guide you through choosing the best chart for your story and creating beautiful and effective charts. The book is aimed at those who want, or need, to get a chart out there but aren’t that interested in the whys. We start by looking at the charts the big boys produce by default and go from there, examining the parts and how to improve them.

The first chapter on chart design covers tables, which was fun to write. There was more to say than we expected.

The next chapter focuses on line charts. There I used Numbers, Excel and DataGraph to create default versions of a line chart. As I expected, there were things that could be better designed in the defaults of all three applications. None of the defaults is usable in our opinion – Numbers’ least of all.

Today, I dove into the details of my bar chart design. Called up the author of DataGraph to discuss moving axis labels by a pixel. Stared at my screen for half an hour, wondering whether I wanted to keep the x axis on the bar chart or not. Then I opened up Numbers and Excel to create the defaults. DataGraph was already open, since I can almost make perfect charts in it already. So I started with the DataGraph default. It disappointed me a bit: the y axis didn’t automatically label my bars. Other than that, it was not pretty but usable. Numbers, to be fair, does automatically label the bars.

Next up was Numbers. At first glance it looked fine: the color of the bars was okay and the bars were labeled correctly. As I checked off items in the designing-a-bar-chart list in my head, everything seemed fine. Until it didn’t. At all.

In disbelief, I went straight to Excel to see if this alarm went off there as well. It did. And there was much wailing and gnashing of teeth. I felt like George Taylor in Planet of the Apes: “You Maniacs! You blew it up! Ah, damn you! God damn you all to hell!”

The default bar charts from Apple Numbers and Microsoft Excel.

They didn’t include the zero on the x axis! This is no small omission. Their default bar chart is a lie! A bar chart encodes each value as the full length of its bar, so when comparing bars you are comparing full lengths – and if the axis doesn’t start at zero, those lengths no longer represent the values and small differences look enormous.

You can tell both applications to include the zero, but that should not be needed. Creating a bar or column chart without the zero on the axis shouldn’t even be possible. This is worse than a 3D pie chart.

There, I’ve said it.
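For readers who build their charts in code rather than in Numbers or Excel, here is a minimal sketch of the same rule – pinning the value axis at zero so the full length of every bar is drawn. This is our own illustration, not part of either application’s defaults; it assumes matplotlib, and the category names and values are made up.

    # Minimal sketch (matplotlib, made-up data): a horizontal bar chart whose
    # value axis (the x axis) is anchored at zero.
    import matplotlib.pyplot as plt

    categories = ["A", "B", "C", "D"]   # hypothetical categories
    values = [96, 98, 101, 103]         # close values: a truncated axis would
                                        # wildly exaggerate the differences

    fig, ax = plt.subplots()
    ax.barh(categories, values)

    # Be explicit about including zero, even where a library already defaults
    # to it; this guards against the truncation described above.
    ax.set_xlim(left=0)

    plt.show()

With the axis anchored at zero, the four bars correctly look almost equal; with a truncated axis the last bar would appear several times longer than the first.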

Written by Þorri

September 13, 2012 at 10:21 pm

Posted in Uncategorized

Instant Feedback: It Applies Everywhere

with one comment

If you work in software development and you haven’t watched Bret Victor’s presentation “Inventing on Principle”, you must do so now:

This video made its rounds among developers and general nerds early this year to much fanfare. The key take-away – at least for me – was that in order for a creative process (like programming) to be effective you must remove the things that stand between an action and its effect as much as possible, making the whole process more like the real world – more “tangible”. Or in Victor’s words (recited from memory):

Just like a painter immediately sees the effects of his brush strokes [...] a coder should immediately see the effects of his code changes

This presentation has inspired several projects based on the “instant feedback” concepts Victor sets forth so powerfully.

Several of these projects have been coming to fruition over the last few weeks, including:

These – as well as most of the examples in Victor’s original presentation – are all IDEs (nerd-speak for whatever nerds use to write programming code).

But while listening to Gabriel Florit presenting livecoding.io at the Boston DataVis meetup yesterday, it dawned on me that this is not just about coding.

Yes, I am *that* slow (or was too distracted from the start, thinking about how empowering these concepts will be for coding), but: instant feedback should be the default behavior for all software.

Granted, the feedback problem is particularly bad in software development, but think about all the other software you use – or develop:

  • Wherever there is an “Apply” button, there is room for more instant feedback.
  • Wherever there is a modal dialog window, there is room for more instant feedback.
  • Wherever the user doesn’t see the effects of a change or a choice until several choices or commands have been made, there is room for more instant feedback.
  • Wherever the user is changing something he’s unable to see, and must hold the results of his actions in memory or imagine them, there is room for more instant feedback.

…and it will always make the software feel more tangible and “natural” to use.

Of course there are cases where – for performance reasons or otherwise – this may not be feasible, but the industry as a whole is way too stuck in the “Apply changes” mindset.
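To make the point concrete outside of IDEs, here is a minimal sketch of the pattern – a hypothetical example of ours, not from Victor’s talk – using Python’s built-in tkinter: the preview updates on every keystroke, so there is no Apply button standing between the action and its effect.

    # Minimal sketch (tkinter, hypothetical example): instant feedback instead
    # of an "Apply" button. The preview label updates on every keystroke.
    import tkinter as tk

    root = tk.Tk()
    root.title("Instant feedback sketch")

    entry_var = tk.StringVar()
    preview = tk.Label(root, text="", font=("Helvetica", 24))

    def update_preview(*_):
        # Apply the change immediately so the user sees its effect right away.
        preview.config(text=entry_var.get())

    # Every edit to the variable triggers the update; no separate apply step.
    entry_var.trace_add("write", update_preview)

    tk.Entry(root, textvariable=entry_var).pack(padx=10, pady=10)
    preview.pack(padx=10, pady=10)

    root.mainloop()

The same pattern applies whether the “preview” is a label, a chart or a whole document: wire the change handler to the edit itself, and reserve an explicit Apply step for the rare cases where recomputing the effect is genuinely too expensive.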

I can definitely see a number of improvements we can and will make to the – already pretty dynamic – interface of DataMarket.com, guided by these principles.

I must echo what my colleague, Vidar Masson, said this morning, talking about Bret Victor: “People are going to remember this presentation for a long time!”

Indeed! And its effects will reach far beyond the IDE.

Written by Hjalmar Gislason

August 22, 2012 at 5:57 pm
