Category Archives: Dev

Quick update for the blueprint list

Just a quick update on the blueprint calculator. (Well, it wasn’t that quick to do, but it should be very simple to understand.)

The blueprint list created by the form will now show ISK/hr, ISK/hr for POS manufacturing (basic arrays only), and ISK/hr for POS with the datacore costs removed, for -4 ME / -4 PE blueprints.

It’s not perfect; T2 ships can’t be made in regular ship arrays, and this doesn’t pay attention to that. R.A.M.s are largely ignored, due to the rounding to less than 1 unit. But it’s indicative enough, and simple to check up on. All skills are assumed to be 3 when determining the cost of the datacores.

Long term, I’m planning to make the skills overridable, but that probably won’t happen until SSO is available, as I don’t want to go to the trouble of writing a full user management system when something should be coming out some time in the next 6 months or so.

The GitHub repository is mostly up to date, but I need to add the details for how to manage the infrastructure needed, like a copy of the price data in the database, possibly with a data loader to pull it from my archived copy.

Updates and future plans

If you drop by the blueprint calculator, you’ll notice a few changes.

New features

  • Regional prices. You can now pick the region on the price list. It will always default to Jita for now.
  • Collapsible price list. It will remember the state you left it in.
  • Parser for a blueprint list. If you open up the S&I interface on the blueprint list (or corp blueprint list), then double-click on the location, you’ll get a window you can cut and paste from. Paste it in, and you’ll get a list of links, with a profit per unit.
  • General optimization. Performance had suffered since I added the table sorting. It should now be back to normal.
  • Added a while back: Decryptors in the invention figures.

Future plans

  • Once the SSO becomes available, you’ll be able to log in and store blueprints, if you want to. This will be on the server, rather than in the client. I’ll probably leave the option to store with cookies too, if you don’t trust me with that. Other preferences will be stored there too.
  • Once CREST becomes available, with a character sheet end point, your skills (if desired) will autofill.
  • Inclusion of invention prices on the parsed results.
  • Inclusion of ISK/hr (ish) on the parsed results.
  • Addition of a menu bar, to bounce between applications.




Don’t worry about the security of SSO. The way OAuth2 works is:

  • You hit the login button, which sends you to the Eve Online login page with a token that identifies where you’re coming from.
  • You log into Eve online, and pick the character you want to be.
  • You get sent back to the source site, with a long token.
  • The source site sends the token to Eve Online, asking ‘who is this?’
  • Eve responds, identifying who you are and invalidating the token for future use. (CREST works a little differently but not significantly.)

At no time do you give anyone except CCP your password. Any third party site asking for it is either badly coded, or a scam. Ideally you’d log into the forums first, and then you won’t need to type a username or password into any site.
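The flow above can be sketched as a toy model (Python here for brevity; this is an illustration of the one-time token idea only, not CCP’s actual API):

```python
import secrets

class ToySSO:
    """Toy stand-in for the login server described above (not CCP's API)."""
    def __init__(self):
        self._tokens = {}                 # one-time token -> character name

    def login(self, character):
        # you log in and pick a character; a long opaque token goes
        # back to the source site with you
        token = secrets.token_hex(16)
        self._tokens[token] = character
        return token

    def who_is(self, token):
        # the source site asks 'who is this?'; answering invalidates
        # the token for future use
        return self._tokens.pop(token, None)

sso = ToySSO()
token = sso.login("Some Character")
print(sso.who_is(token))    # identifies the character
print(sso.who_is(token))    # token already spent, so: None
```

Note that at no point does the source site ever see a password; it only ever handles the short-lived token.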

Updated Bill of Materials (BOM) SQL for the SDE

Just an update for my previous post on SQL for blueprints using the SDE. Now you can generate a complete BOM with a single statement.

I now have a use for getting /all/ the numbers out in a single query and that is what you’ll find below. You could even, if you so wanted, join it in as another table, against a table of blueprint product ids, MEs and a set PE, along with prices with some sums, to generate a full costing for a number of blueprints.

This is written specifically for use with an interface that lets you use named parameters (DBI, PDO, JDBC), and requires :typeid as the id of what you’re making, :me as the ME of the blueprint, and :pe as the production efficiency of the character making them. It also requires a greatest() function, which MS SQL doesn’t have, so you’d need to replace that with a CASE expression.
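If you haven’t used named parameters before, the binding looks like this (a minimal sketch using Python’s sqlite3 with a throwaway query and made-up values; the real statement below is MySQL-flavoured):

```python
import sqlite3

# throwaway in-memory database; :typeid and :me are named placeholders,
# bound from a dict at execute time rather than pasted into the SQL
conn = sqlite3.connect(":memory:")
row = conn.execute(
    "select :typeid as typeid, max(0, :me) as clamped_me",
    {"typeid": 587, "me": -4},
).fetchone()
print(row)    # (587, 0)
```

DBI and PDO follow the same pattern: the SQL carries the placeholder names, and the driver substitutes the values safely.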

    select typeid,name,sum(quantity)+(sum(perfect)*(0.25-(0.05*:pe))*max(base)) quantity from (
    select typeid,name,round(if(:me>=0,greatest(0,sum(quantity))+(greatest(0,sum(quantity))*((wastefactor/(:me+1))/100)),greatest(0,sum(quantity))+(greatest(0,sum(quantity))*(wastefactor/100)*(1-:me)))) quantity,1 base,greatest(0,sum(quantity)) perfect from (
      select invTypes.typeID typeid,invTypes.typeName name,quantity
      from invTypes,invTypeMaterials
      where invTypeMaterials.materialTypeID=invTypes.typeID
       and invTypeMaterials.typeID=:typeid
      union all
      select invTypes.typeID typeid,invTypes.typeName name,
             invTypeMaterials.quantity*r.quantity*-1 quantity
      from invTypes,invTypeMaterials,ramTypeRequirements r,invBlueprintTypes bt
      where invTypeMaterials.materialTypeID=invTypes.typeID
       and invTypeMaterials.typeID=r.requiredTypeID
       and r.typeID=bt.blueprintTypeID
       and r.activityID=1 and bt.productTypeID=:typeid and r.recycle=1
    ) t join invBlueprintTypes on (invBlueprintTypes.productTypeID=:typeid) group by typeid,name
    union all
    select t.typeID typeid,t.typeName name,r.quantity*r.damagePerJob quantity,0 base,r.quantity*r.damagePerJob perfect
    from ramTypeRequirements r,invTypes t,invBlueprintTypes bt,invGroups g
    where r.requiredTypeID=t.typeID and r.typeID=bt.blueprintTypeID
     and r.activityID=1 and bt.productTypeID=:typeid and g.categoryID!=16
     and t.groupID=g.groupID) outside group by typeid,name

It’s not a nice query, with more selects in it than I’d like, but that’s pretty much needed with how the data is structured.
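If it helps to see the waste arithmetic outside SQL, here’s the same per-material calculation as a standalone function (a sketch of the query’s formulas; `wastefactor` is invBlueprintTypes.wasteFactor, e.g. 10 for 10%):

```python
def material_quantity(perfect, wastefactor, me, pe, base_material=True):
    """Units of one material per run, mirroring the query's arithmetic."""
    if me >= 0:
        # positive ME: waste shrinks as wastefactor / (ME + 1)
        me_waste = perfect * (wastefactor / (me + 1)) / 100.0
    else:
        # negative ME: waste grows linearly with how negative it is
        me_waste = perfect * (wastefactor / 100.0) * (1 - me)
    quantity = round(perfect + me_waste)
    if base_material:
        # production-efficiency skill waste applies to base materials only
        quantity += perfect * (0.25 - 0.05 * pe)
    return quantity

print(material_quantity(100, 10, -4, 5))   # a -4 ME blueprint, PE 5
```

With PE 5 the skill waste term drops to zero, which is why the outer select multiplies by (0.25 - (0.05 * :pe)).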

Importing price data into spreadsheets

Importing up-to-date price data into your spreadsheets is a large part of being a successful manufacturer or trader in Eve Online. Some bypass it by using tools, but the tools will still have to do it.

I’ve posted about how to load the data from Eve Central into a web page, but I thought it was time to talk about how to do it in a couple of different spreadsheet packages, namely Excel and Google’s spreadsheet. Open Office is quite a bit harder to do XML imports with, so I’m just mentioning one option right down at the bottom, which is a lot less flexible.

Your first choice is the source of your data. There are two main sources, Eve-central, and eve-marketdata. They’re pulling data from pretty much the same source, so there’s not a lot in it.

Eve Central’s API

This is fairly well documented on the Eve-Central site. The short version is:

  • Create a URL to pull the data you want.

This will look something like: http://api.eve-central.com/api/marketstat?typeid=34&regionlimit=10000002

If you want more types, just add them to the url with &typeid=36 etc. The numbers are the typeIDs from invTypes. Just grab an up-to-date copy from my data dump conversions if you can’t find something. It’s not a bad idea to include this, or a cut-down copy consisting of typeid,typename,typeid, in your spreadsheet on a separate worksheet, for use in vlookups.
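Building that url with repeated &typeid= parameters is easy to script; a sketch in Python against Eve-Central’s marketstat endpoint:

```python
from urllib.parse import urlencode

# a list of (key, value) pairs lets the same key repeat in the query string
typeids = [34, 35, 36]
params = [("typeid", t) for t in typeids] + [("regionlimit", 10000002)]
url = "http://api.eve-central.com/api/marketstat?" + urlencode(params)
print(url)
```

The same pairs-list trick works for any API that expects a repeated parameter rather than a comma-separated list.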

Eve Market Data’s API

I’d recommend this one if you’re going to be pulling a lot of data, as it’s possible to pull the entire market in one go, for a particular station or region. Documentation can be found on their site

The url will look like this: http://eve-marketdata.com/api/item_prices2.xml?char_name=yourname&type_ids=34,12068&region_ids=10000002&buysell=s

Just add the typeids to the comma-separated list, or remove &type_ids=[stuff] from it if you want everything. Change the character name to your own.

For both:

  • Station IDs: the staStations table in the SDE
  • System IDs: mapSolarSystems
  • Region IDs: mapRegions
  • TypeIDs: invTypes (or my data dump conversions)



Excel

Be glad of heart, for Microsoft made this really simple. All of this applies to Excel 2007 and above. My screenshots are from Excel 2013 (Office 365 Home Premium is pretty good.)

On the Data tab


Pick ‘From Web’


Fill in the url you have into the Address bar and hit Go. When it’s loaded, it’ll look like this. Hit Import.


Just hit OK to this. It’s not important.


I normally put it into a new worksheet, to keep it out of the way.

Once it’s loaded in, you can update the data by hitting the ‘refresh all’ option on the data tab. It /should/ refresh when it’s opened.

You now have market data in your workbook that you can get at with a vlookup. See later for an example. You can download the example workbook here.

Google Documents

I’m not going to go into screenshots. You can find an example of how it’s done here. I thought this was probably the best way to do it.

Short version is:

Get the url, and use importXML() with it.

With Eve Market Data I suggest using importdata() and the text format instead. It makes it a /lot/ easier to get everything you want out of it. The url is almost the same, except it ends in .txt rather than .xml. You then split it on tabs ( char(9) ).

Be aware, Google isn’t great when you’re working with a lot of data. A desktop spreadsheet package works a lot more smoothly; Google will run into processing limits pretty quickly.
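Outside a spreadsheet, the tab-separated text feed splits just as easily; a Python sketch with made-up rows (the real feed has more columns, so treat this layout as purely illustrative):

```python
import csv
import io

# two fake rows in a tab-separated layout: typeid, region, buy/sell, price
feed = "34\t10000002\ts\t4.95\n35\t10000002\ts\t11.20\n"
rows = list(csv.reader(io.StringIO(feed), delimiter="\t"))
prices = {int(row[0]): float(row[3]) for row in rows}
print(prices)
```

This is all importdata() plus a split on char(9) is doing for you under the hood.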



vlookup()

Learn to love this function. It’s very, very useful for what you’ll be doing.

=vlookup(what you want to search for, where you want to find it, which column to return, and false here if the data isn’t sorted)

So you could use =VLOOKUP(34,Sheet2!D:AT,34) to get, from the Excel sheet above, the percentile price for Tritanium (typeid 34).

And you could use =VLOOKUP("Tritanium",typeids!B:C,2,false) to get the typeid for Tritanium from a worksheet called typeids, with typeid,typename,typeid as columns.
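That two-step lookup (typename to typeid, then typeid to price) is the same pattern as a pair of dictionary lookups; in Python, with made-up prices:

```python
# mini stand-ins for the typeids worksheet and the price sheet
typeids = {"Tritanium": 34, "Pyerite": 35}   # typename -> typeid
prices = {34: 4.95, 35: 11.20}               # typeid -> percentile sell price

# vlookup by name, then vlookup the result against the price data
price = prices[typeids["Tritanium"]]
print(price)    # 4.95
```

Once you see vlookup as a dictionary lookup, chaining them stops being confusing.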


Open Office/Libre Office

Doing an XML load into Open Office is a complete PITA, involving a lot more work. There are no screenshots, as I don’t have it installed any more. What I’d suggest doing is using the text option from Eve Market Data.

Use ‘Insert’->’Sheet from file’

When it asks for a file, give it the full url (like http://eve-marketdata.com/api/item_prices2.txt?char_name=yourname&type_ids=34&region_ids=10000002&buysell=s )

Hit Open and wait. It’ll take a while to work the first time. Eventually, it’ll pop up the text import screen. Make sure tab is selected and hit OK. It’ll take a while before it becomes responsive, but eventually ‘sheet1’ should show up in the from-file box. Make sure the link checkbox is ticked. Hit OK. You now have a sheet that should reload whenever the workbook is opened. It can be vlookuped in the same way as everything else.



Using the Eve Central API with PHP

We’re going back to basics here. One thing that many sites using the SDE from Eve want to do is access price data from the Eve Central API. I don’t do this myself, having an EMDR relay and consumer to provide my market data, but back when I was starting out, I did. It’s a pretty simple thing to do, the only pain in the neck being the data structures PHP wants to use when you’re working with XML. It likes throwing in arrays where you don’t think it should, which can be a bit of a pain. If you ever need to debug this, var_dump() will be your friend, showing you exactly how the data has been structured.

So, here’s a fairly basic bit of code. It asks Eve Central for the information for Tritanium, then prints out the so-called ‘percentile’ price. This is the price if someone were to buy 5% of the market, then average the cost out. It’s handy for ignoring the outliers, if not entirely accurate. The code is more complicated than it needs to be, but we’ll get to why in the second example.

$typeid=34;
$pricexml=file_get_contents("http://api.eve-central.com/api/marketstat?typeid=".$typeid."&regionlimit=10000002");
$xml=new SimpleXMLElement($pricexml);
$item=$xml->xpath("/evec_api/marketstat/type[@id='".$typeid."']");
$price= (float) $item[0]->sell->percentile;
echo $price;

Fairly simple, really. You get the type id for Tritanium and the url that will pull back the data for The Forge, mash them together, then use file_get_contents to retrieve the result. Please note: some PHP installs will stop this from working. There’s a more complicated version at the bottom which may get you round this; functionally identical, but using curl instead of file_get_contents.
Once you have the output from EVE Central, you push it into the SimpleXML parser, grab the bit that’s just about tritanium (the XPath bit does this. This is the over complicated bit), then pull just the sell percentile out of it. The reason for the [0] is because the Xpath bit could have returned more than one entry. In this case it can’t, but the parser is taking the safe option and giving you an array. The (float) is there as without it, you’re dealing with a SimpleXML data type, which is a pain to do math with. (float) just converts it into a regular number.

Now for the more complicated version. Eve central allows for multiple items to be queried at the same time. This is why I put the xpath in there. It allows us to target a specific type being returned, rather than having to go through each in turn to see if it’s the one we want, before continuing.

    $typeids=array(34,35,36,37,38,39,40);
    $pricexml=file_get_contents("http://api.eve-central.com/api/marketstat?regionlimit=10000002&typeid=".join("&typeid=",$typeids));
    $xml=new SimpleXMLElement($pricexml);
    foreach($typeids as $typeid) {
        $item=$xml->xpath("/evec_api/marketstat/type[@id='".$typeid."']");
        $price= (float) $item[0]->sell->percentile;
        echo $typeid." ".$price."\n";
    }

This time, instead of a single type id, we’re asking for multiple. We store them in an array, then mash that array into the url, with a join (or implode) which adds the right number of &typeid= to it. Then it goes on as before until just after the xml is parsed, where it goes and grabs each individual element out in turn, iterating through the array.

If you wanted to pull the minimum buy price, you’d replace the $price= (float) $item[0]->sell->percentile; with $price= (float) $item[0]->buy->min; and so on. It’s fairly simple.

Now, the curl version, of the first bit of code:

$url="http://api.eve-central.com/api/marketstat?typeid=34&regionlimit=10000002";
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_HEADER, 0);
$response = curl_exec($ch);
if($response === false) {
    echo 'Curl error: ' . curl_error($ch);
} else {
    $xml=new SimpleXMLElement($response);
    $item=$xml->xpath("/evec_api/marketstat/type[@id='34']");
    $price= (float) $item[0]->sell->percentile;
    echo $price;
}

It’s a bit more complicated, but it gives you a way of dealing with a remote site not responding. Perhaps looking at a different site for price data, for example.

Remember, the more calls you make to remote sites, the slower your pages will be to load. Consider caching the price data locally, perhaps in a database or, if you have access to it, memcache (my preference, but it’s uncommon).
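A local cache along those lines can be very small; a sketch in Python (the fetch callable and TTL are placeholders for however you actually pull prices):

```python
import time

class PriceCache:
    """Cache prices locally so every page load doesn't hit the remote API."""

    def __init__(self, fetch, ttl=3600):
        self.fetch = fetch    # callable: typeid -> price (the slow remote call)
        self.ttl = ttl        # seconds before a cached price goes stale
        self.store = {}       # typeid -> (price, fetched_at)

    def get(self, typeid):
        hit = self.store.get(typeid)
        if hit is not None and time.time() - hit[1] < self.ttl:
            return hit[0]                      # fresh enough: no remote call
        price = self.fetch(typeid)             # slow path: ask the remote site
        self.store[typeid] = (price, time.time())
        return price
```

The same shape works with a database table or memcache as the backing store; only the dict changes.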

Dev Track Day 2 – Part 2

This was the WebGL session.

It’s great. Really really great.

The only problem is, potentially, delivering the models to the clients. Other than that, it’s wonderful. If you can take the loading time, it’s great. Go fork it.


Not a huge amount to say, really. It was demos of graphics stuff, which looked great.

Dev Track – Day 2 – Part 1

Just a small gap between the SSO session and the WebGL session.

The SSO looks to be fairly simple to work with. It’s a shame that the demo mobile applications which were shown didn’t use a native browser, using an embedded one instead, which is far from good practice with OAuth2; but it does make them easier to understand, with less explanation needed (custom protocols for the callback to get the token to the application, for example. I believe Aura does this when getting keys from the site).

It was fairly obvious, from some of the questions being asked, that there were some fundamental misunderstandings, but in the end, it works well enough. More demonstration code will be needed, but I’m sure the community will provide some. I’ll probably do it, when I kick it around myself.

There was some discussion at the end about situations like character transfers not being obvious to third party sites, but the devs seem receptive to possibly adding a hash based on the userid and characterid, so you can’t tie users together, but you can tell when one is transferred.

No ETA on deployment, unfortunately. As it’s tied to the dev license, I wouldn’t expect it until at least the next draft (30 days+)

There was also a request from the devs to make sure that we make it obvious on the forums if we’d like an updated IGB, and two factor authentication. I’ve added posts in the tech lab, so if you could go and wave in them, I’d appreciate it.

Dev Track – Day 1

Just a quick post, jotted down before I go to get some food this morning. The Dirty breakfast at the Laundromat is calling my name, and it’s quite persuasive.

You can see the sessions on the stream archive; just hit the videos link on the right hand side. I’d especially recommend you watch the legal one, starting about 4 hours, 24 minutes into the bigger video from yesterday, especially if you’re running a site with any advertising or donation links. There’s a paragraph further down with some advice, if this applies to you.

You can also see me! The bald looking guy in the middle, about 4 rows from the front, dressed in black, with glasses, a beard, and a laptop (this one that I’m typing on right now) I do actually have some hair, but it’s very short, and not that noticeable at range.

The Overview session was just that. Going over what we were going to be talking about, and the format changes. I was quite comfortable with the changed format, with the devs up the front on couches and seats, and us in the fairly large hall. Might not have been quite so intimate as the roundtable format, but I liked it. Still got to ask questions, and the answers from the devs were more verbose. And the quality of the stream was far higher.

We should be getting a developers site at some point in the future (Soon™), with more technical blogs, the signup stuff for the developers license, the application key setup and so on. No delivery date, but hopefully very soon.

The first real session was the CREST/API session. The API is under development once more, mostly for bug fixes and eliminating annoyances like the 119 error for killmails. Woo! 🙂 It’s been handed to CCP Prism X, and we should be seeing some work done on it, as well as a better handover process when/if it leaves his capable hands. Until now, it’s been very much a side project, which is less than ideal. Don’t expect many new features from it, because of what it’s like, and because of CREST, but some stuff may be happening.

CREST, as always, looks great. And there appears to be a commitment to open up as much as game design lets them, and to badger other devs to write the interfaces when they put in new stuff. They’d be happy enough to open it all up, but there are philosophical reasons not to. Such as: don’t open up the market entirely, as then it will be botted to hell. There are a few trains of thought on this matter, and it goes well beyond the scope of this post to discuss them all. I’m of the opinion that some areas should be left closed, to prevent third party software becoming a requirement to be competitive. Others say ‘Publish and be damned. The metagame will sort it out.’ It’s a philosophical debate rather than a technical one, so it’s not likely to be settled any time soon.

The Licensing and Policy session was very much a discussion. A new version of the developer license should be released within 30 days or so.

Under the current terms, and the published future ones, you /cannot/ charge for an application. Right now, advertising and donations are more than a touch iffy (if you’re using CCP’s IP, like the art, SDE or API).

Watch the session. I spoke to CCP Seagull after it, and she made it very clear they don’t want to screw over any sites which aren’t there to make a profit, where it’s just to cover their costs. But from a licensing perspective, it’s a trifle more difficult, especially as this crosses global law, which makes it complicated. If you are having trouble, email them and open the dialogue; you may be able to get a license which covers both you and them appropriately. This doesn’t appear to be CCP Games trying to crack down on third party devs. It’s just trying to protect themselves from backlash from their partners (who have to profit share, and can get annoyed (legally) at people getting stuff for ‘free’).

In a shock announcement (it wasn’t on the schedule), it looks like the SDE may well be coming to us as an SQLite file, reversing all the YAML stuff. =D It’s for development technical reasons, one of the main ones being: during the build process, they don’t want to have to spin up any SQL servers. SQLite lets them create the database as a build artifact without doing so. We should be getting a copy of that. There are still some restrictions on /what/ will be in it, but those are, again, game design reasons rather than technical ones. As before, I’ll see about republishing it in MySQL, as I’ll be doing a conversion for myself.

All in all, a pretty good day. The legal talk was a touch disappointing, if entirely understandable. If you have any questions you want to ask during today’s session, you can try asking on the Twitch stream, or see if you can get me to ask, through the #eve-dev IRC channel.

Hmm. Destiny is calling my name. Or is it just breakfast? Either way, I go to answer the call.

Market Data API – What I’d like

There’s been a bunch of posts on the forums about how cache scraping is bad and against the EULA, and how CCP Sreegs hates it. It’s also been said by some GMs that it’s ok, so that’s not something I’m going to go into.

Sreegs did say that, while he doesn’t like scraping for market data, he’d prefer to see that in an API. This is something I heartily agree with. We need a source of market data, available outside of the game. There’s a few things which I’d want from it, however.

  1. The data, while accurate, shouldn’t be ‘perfect’.
  2. To make 1 work, you don’t get to ask for data for a particular item, in a particular region, at the current time. No requests for specific data.
  3. It should, however, be timely. For common data, I should be able to get it fairly up to date.

The thing is, right now, I have a source just like that. It’s EMDR. And it’s pretty much perfect. A firehose of market data, to do with as you please.

It’s not an ideal source for use with spreadsheets. You’ll need something listening to it 24×7, and then processing it. But that’s what Eve Central does right now. It’s what I do right now. I provide exports of the data for use in spreadsheets, which quite neatly takes care of this.

So, the actions I think CCP need to take:

  • Define a rarity list of items and regions. Common items and regions should be updated often, rare ones less often. So if you’re listening to it, you should get the mineral prices in Jita every 20 minutes or so. Imperial Navy large EMP smartbombs in 760-9C, however, should be maybe once a day.
  • Set up a ZeroMQ (or similar; no real queue, just a stream) end point for people to subscribe to. Maybe add an IP-level filter so people can subscribe with their Eve accounts, tied to an IP address. It’s not suitable for the general user, but it’s perfectly good for the enablers (such as myself).
  • Sit back and watch the data stream out.

Yes, it’s not a minor job. But it should reduce the number of characters who just sit in a market hub requesting market data once every 2 seconds.

And it totally removes the possibility of people poisoning the data (unless they provide a service themselves)
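The rarity scheme above boils down to a per-tier interval check; a toy sketch (the tier names and intervals here are my own illustration, not anything CCP has specified):

```python
# seconds between pushes for each rarity tier, per the scheme above
INTERVAL = {"common": 20 * 60, "rare": 24 * 60 * 60}

def due_for_push(tier, last_pushed, now):
    """True when an item/region pair should go out on the stream again."""
    return now - last_pushed >= INTERVAL[tier]

print(due_for_push("common", 0, 1500))   # 25 minutes later: True
print(due_for_push("rare", 0, 1500))     # still well within a day: False
```

Jita minerals land in the common tier; the 760-9C smartbombs land in the rare one, and everything else sits somewhere in between.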

Useful bits and pieces – Eve SDE

Once you get comfortable with the CCP way of doing things, the SDE isn’t too bad to work with. You’ll have to dig around in it to get some of the numbers you want, and sometimes it’s the opposite way round from what you’d expect, but it’s not too bad, most of the time. Here are a few more useful snippets for getting information on things.