Big files

The data collection is working perfectly and isn’t having any impact on the trading, by which I mean it isn’t slowing it down at all. I use a timer to track code execution times.
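A minimal sketch of that sort of timing check, assuming Python and a hypothetical `record_prices` step standing in for the actual data-collection code:

```python
import time

def timed(fn):
    """Wrap a function and report how long each call takes."""
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = fn(*args, **kwargs)
        elapsed = time.perf_counter() - start
        print(f"{fn.__name__} took {elapsed * 1000:.3f} ms")
        return result
    return wrapper

@timed
def record_prices(snapshot):
    # hypothetical placeholder for the data-collection step
    return len(snapshot)

record_prices([2.5, 2.52, 2.48])
```

`time.perf_counter()` is the usual choice here because it is a monotonic, high-resolution clock, so it isn’t affected by system clock adjustments mid-measurement.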

The issue is file size. At the end of each day’s trading, the data is saved to a new file, and these files are around 150 MB each. I keep all my trading files in cloud storage, which has a capacity limit. That got me thinking: firstly, to store the data locally; secondly, how to make use of it. This takes me back to where I started with data collection: what to do with it.

It’s all about analysis. If I work out the process of analysis and find the relevant data points I need to monitor, I can perform the calculations live and save only the results, rather than everything.
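One way to do that is to keep running statistics as ticks arrive, so only a few numbers need saving at the end of the day. A sketch, assuming Python; the class name, the chosen statistics, and the sample prices are all illustrative, not the actual data points being monitored:

```python
import json

class LiveStats:
    """Incrementally track summary statistics so only the results
    need saving, not every raw price tick."""
    def __init__(self):
        self.count = 0
        self.total = 0.0
        self.high = float("-inf")
        self.low = float("inf")

    def update(self, price):
        self.count += 1
        self.total += price
        self.high = max(self.high, price)
        self.low = min(self.low, price)

    def summary(self):
        return {
            "ticks": self.count,
            "mean": self.total / self.count if self.count else None,
            "high": self.high,
            "low": self.low,
        }

stats = LiveStats()
for price in [2.5, 2.6, 2.4, 2.8]:   # stand-in for a live price feed
    stats.update(price)

# save the few bytes of results instead of megabytes of raw ticks
print(json.dumps(stats.summary()))
```

The point is that each update is constant-time and constant-memory, so the live calculation adds next to nothing to the trading loop while the daily file shrinks from megabytes to bytes.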

Having a good few days worth of data will be useful for testing but I don’t want to have endless reams of stuff I might never go back to. So once I’ve got a couple of gigs stored, I’ll stop the collection and start the analysis.

5 thoughts on “Big files”

  1. Of course, when you get on to coding your own price bot with API-NG, the returned data will be in JSON format, which is a database in itself.

    I keep all my data in JSON and wrote a program to convert a subset of it into CSV when I need to chart it on a spreadsheet or load it into a machine learning program, etc. I have no need for a relational database: it would force me to look at the data in a relational way, and I’d rather be free of such constraints.

    BPT –
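The conversion the commenter describes could be as little as a few lines in Python. A sketch, with hypothetical record fields (`market`, `time`, `back`, `lay`, `volume`), not the commenter’s actual schema:

```python
import csv
import io
import json

# hypothetical JSON records; a subset of fields is exported to CSV
records_json = '''
[{"market": "1.23", "time": "14:00", "back": 2.5, "lay": 2.52, "volume": 1000},
 {"market": "1.23", "time": "14:01", "back": 2.48, "lay": 2.5, "volume": 1500}]
'''

fields = ["time", "back", "lay"]          # the subset worth charting

records = json.loads(records_json)
out = io.StringIO()
writer = csv.DictWriter(out, fieldnames=fields, extrasaction="ignore")
writer.writeheader()
writer.writerows(records)

print(out.getvalue())
```

`extrasaction="ignore"` tells `DictWriter` to silently drop the JSON keys that aren’t in the chosen field list, which is what makes the “subset” part a one-liner.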

