WTB – CREST market data endpoint

The popularity of sites like Eve-Central should go a long way towards proving that the Eve community wants market data.

Community projects like EMDR should also show that many people, myself included, consider access to market data, instant or at least time-delayed, near vital for projecting manufacturing figures.

Unfortunately, the only way to get such data in anything like a timely fashion is to cache scrape. At least some members of CCP have stated their disapproval of this method, which led to an outcry the last time the EULA was clarified, leaving us in the current state of ‘it’s against the rules, but we’ll only do something about it if you do something else bad’. It has also led, with the release of Odyssey 1.1, to at least a period with no market data: the cache format was updated, so every scraper needed updating to cope.

I’m calling for CCP to take steps to combat this, and I’m offering a couple of suggestions on how this could be taken forward. The first requires the least change from the community; the second requires more change from us, but possibly the least from CCP.

Provide an EMDR feed

This would take the most effort from CCP: a client that queries TQ and throws market data out onto the EMDR network. Pretty much all the major sites that deal directly with uploads already talk to EMDR, so this would give immediate integration with those sites. Thought would have to be given to how data is ordered coming out of the firehose: more popular data more frequently, less popular data less often. So Tritanium in Jita would be updated frequently, while 100mm Imp. Navy Plates in Minmatar faction warfare space might be updated once a week.
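To make the ‘more popular, more frequent’ idea concrete, here’s a minimal sketch of popularity-weighted refresh intervals. The item names come from the examples above, but the query counts, the scaling constant, and the clamping bounds are entirely made up; this is just one shape such a scheduler could take.

```python
from dataclasses import dataclass

# Hypothetical scheduling sketch: nothing here is CCP's or EMDR's
# actual design, just one way the firehose ordering could be weighted.

@dataclass
class MarketItem:
    type_name: str
    region: str
    daily_queries: int  # stand-in for "popularity"

def publish_interval_hours(item: MarketItem,
                           min_hours: float = 1.0,
                           max_hours: float = 24.0 * 7) -> float:
    """More popular items get refreshed more often, clamped so that
    even the most obscure item still goes out roughly weekly."""
    if item.daily_queries <= 0:
        return max_hours
    interval = 10000.0 / item.daily_queries  # arbitrary scaling constant
    return max(min_hours, min(max_hours, interval))

tritanium = MarketItem("Tritanium", "The Forge", 50000)
plates = MarketItem("100mm Imp. Navy Plates", "Heimatar", 5)

print(publish_interval_hours(tritanium))  # 1.0 (clamped to hourly)
print(publish_interval_hours(plates))     # 168.0 (clamped to weekly)
```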

Provide a CREST Market data endpoint

With appropriate caching on the results, this is probably the best long-term solution, as it requires CCP to introduce no new technology to their mix. A pair of endpoints where you can request the basic order information, or the history, for a typeid in a specific region. Cache it when it’s requested, and serve that same data for the next hour. You’ll still have sites like Eve-Central amalgamating the data across regions, providing historical analysis, and offering their own APIs for aggregate data over multiple typeids. You’ll probably also have services like EMDR taking feeds from the people querying the API and relaying the data to everyone else, reducing the number of direct queries needed.
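A minimal sketch of the caching behaviour described above. The function names and IDs are hypothetical, not CCP’s actual CREST API; it just shows ‘query once, serve the same data for an hour’.

```python
import time

# Hypothetical one-hour server-side cache for a CREST-style market
# endpoint. fetch_orders_from_tq is a placeholder for the expensive
# query against Tranquility; the IDs used below are illustrative.

CACHE_TTL = 3600  # serve the same data for an hour
_cache: dict[tuple[int, int], tuple[float, dict]] = {}

def fetch_orders_from_tq(region_id: int, type_id: int) -> dict:
    """Stand-in for the real (expensive) query against TQ."""
    return {"region": region_id, "type": type_id, "orders": []}

def get_market_orders(region_id: int, type_id: int) -> dict:
    """Return cached orders if still fresh, otherwise query and cache."""
    key = (region_id, type_id)
    now = time.time()
    if key in _cache:
        fetched_at, data = _cache[key]
        if now - fetched_at < CACHE_TTL:
            return data  # everyone gets the same snapshot for an hour
    data = fetch_orders_from_tq(region_id, type_id)
    _cache[key] = (now, data)
    return data
```

The point of the hour-long TTL is that a thousand clients asking for the same region/typeid pair cost TQ one query, not a thousand.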

CCP could even rate limit it per IP, and the players will take care of the rest.
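As a sketch of what per-IP rate limiting could look like on the server side, here’s a simple token bucket. The refill rate and burst size are invented numbers, not a proposal for what CCP’s limits should be.

```python
import time
from typing import Optional

# Hypothetical per-IP token bucket. Each IP gets a bucket of tokens;
# a request spends one, and tokens refill steadily over time.

RATE = 30 / 60.0   # refill: 30 requests per minute (made up)
BURST = 10.0       # maximum burst per IP (made up)

_buckets: dict[str, tuple[float, float]] = {}  # ip -> (tokens, last_seen)

def allow_request(ip: str, now: Optional[float] = None) -> bool:
    """Spend one token for this IP if available; refill based on
    elapsed time since the IP was last seen."""
    if now is None:
        now = time.time()
    tokens, last = _buckets.get(ip, (BURST, now))
    tokens = min(BURST, tokens + (now - last) * RATE)
    if tokens >= 1.0:
        _buckets[ip] = (tokens - 1.0, now)
        return True
    _buckets[ip] = (tokens, now)
    return False
```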

Why

Perfectly timely market data is a liability for traders. No external market data is a greater liability, as it precludes analysis entirely.

No market data causes third-party developers pain. Killboards won’t have an ISK value to apply to kills. Manufacturers will be stuck updating spreadsheets by hand, which is far from a fun job and will lead to burnout.

It’d be nice if CCP could do both.

I’m indebted to those individuals who have written the means for us to gather market data. I’m not looking to denigrate your work in any way, shape or form; I’m just not comfortable with the bus factor of such projects.

Of course, cache scraping is used for other projects too, such as libdogma, a very handy fitting library that can be integrated into websites. Without the ability to get that fitting data, fitting tools are very difficult to implement.