Import•io allows users to extract structured data from websites without writing any computer code. Users simply navigate to a website and teach the browser to extract data by showing it examples of where the data is; learning algorithms then generalise from these examples to work out how to extract all of the data on the website. The data that users collect is stored on import•io's cloud servers, where it can be downloaded and shared.
Users can also generate an API from the data, allowing them to easily integrate live web data into their own applications or into third-party analytics and visualisation software. The import•io data browser is free to download from the website.
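To illustrate the kind of integration described above, the sketch below shows how an application might consume a JSON payload of extracted rows. The response shape and field names here are illustrative assumptions, not import•io's actual API format.

```python
import json

# Hypothetical payload: the "results" structure and field names below are
# assumptions for illustration, not import.io's real API response format.
sample_response = json.dumps({
    "results": [
        {"product": "Widget A", "price": "9.99"},
        {"product": "Widget B", "price": "14.50"},
    ]
})


def parse_rows(payload):
    """Convert a JSON payload of extracted rows into a list of dicts."""
    return json.loads(payload)["results"]


rows = parse_rows(sample_response)
for row in rows:
    print(row["product"], row["price"])
```

In a live integration, `sample_response` would instead be the body of an HTTP response fetched from the generated API endpoint.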
Import•io’s mission is “to structure the web and make web data available to everyone.” Co-founder and Chief Data Officer Andrew Fogg said, “Web data is the biggest information opportunity for humanity since web search. Google solved the problem of searching for documents on the web; import•io is about accessing the data that is trapped inside those web documents so that you can actually do something with it. You can think of what we do as part of the 'big data' revolution, but big data will only get exciting once we put the power to use it in the hands of ordinary people. The last really exciting data technology was Microsoft Excel; look at how that changed how all of us live and work. We believe that data is power, and we want everyone to have an equal share of that power. Along with companies like Tableau, we are excited to be making the power of data available to everyone.”
Import•io launched into Developer Preview in November 2012 and won Best Startup at the O’Reilly Strata conference in Santa Clara in February 2013. The launch into Beta will occur at TechCrunch Disrupt San Francisco. Import•io will also be exhibiting at the GigaOm Structure Europe conference, 18th September 2013, and the