Tag Archives: automated trading

Australian greyhounds and other activities

As mentioned in a previous post, an Oscar clone has been let out of the traps (ha ha, out of the traps – greyhounds, traps, letting out of the… yeah, yeah) and has been trading the Australian greyhounds for the past week. Liquidity is generally lower on these markets, especially for the earlier races, but some of the later races are OK. With settings almost identical to those for the UK, only 27 races saw any trading, with a total of 115 bets settled. A profit of 27 pence (coincidentally) on £270.69 traded gives a P&L/TV of 0.1%, which is not an unusual figure for Oscar. I’ll let it run for now to see what results come in.

[Chart: aus_dogs_170521]

Other activities

I know many of you will have it on your mind, but for those who are new here – back in October I did a post on stakes, ending with me wondering what to do with any profits. I gave four possibilities, as I saw them –

  1. Leave it where it is, doing nothing.
  2. Create a second Betfair account, for other/future bots or split activities.
  3. Remove from Betfair, put it in savings to be returned to Betfair when required.
  4. Remove from Betfair, spend it (I doubt this will happen).

Option 2 is not an option. I thought option 3 would be the one to go with, but the wife suggested (after pointing out that there weren’t vast amounts to play with, “and after all that time you spend on the computer”) that I should buy something – option 4 – which was MY LEAST favoured option. So after thinking it over and realising that she was right, because that’s how it is, I decided to part company with my faithful 20+ year old mountain bike (Claud Butler frame swapped for a scrap ’85 Ford Escort, crank and gears from a previous GT, all other parts swapped/added separately pre-2000, except tyres) and purchase a brand spanking new, shiny, modern, lighter, full-suspension, alloy-framed, disc-braked mountain bike. I’ve been using it regularly for the past few months to try (really try) to improve my fitness. This is most certainly a work in progress.

[Photo: img_20170114_140522068 – Old]
[Photo: img_20170114_140422315 – New (when it was new)]

(p.s. I still have the old bike, just can’t bring myself to drop it at the tip after all these years)


Speedy data 2

My Speedy data post generated a few comments and some discussion. I really appreciate people taking the time to get involved and share their knowledge and views.

The first comments came via Twitter from TraderBot (here and here) with a link to Stack Overflow. This is a site I’ve found really useful for help with many programming issues in multiple languages (that reminds me, I keep meaning to do a list of what I use – apps and sites). The Q&As linked to, although relating to Java, are an interesting read, with the answer to the speed question being, in summary, “it depends on what exactly you want to do/measure”.

LiamPauling commented on the post asking where I’m hosted and whether I stream. I’m cloud hosted and not streaming. He went on to say that he thinks the bottlenecks are more likely elsewhere, which, after further reference in later comments, seems to be a good point.

Betfair Pro Trader asked why I wanted to use an array. It’s not that I want to use an array more than any other data structure; I was looking for the best solution, if such a thing exists (which is becoming less clear).

Tony, via Twitter, suggested running test code with the different structures used. This could be useful, but I was initially put off by the confusion I was getting from reading differing opinions based on various implementations of arrays, collections and dictionaries (and later, lists). At this point I was thinking that the optimum structure depends on the specific use and there isn’t an exact answer to my speed question.
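For anyone wanting to try the same, below is a minimal sketch of the sort of test Tony means – timing the same sequential reads against an array, a List and a Dictionary. It’s VB.NET because that’s what I’m working in, and the sizes and access pattern are made up purely for illustration, so treat it as a starting point rather than a proper benchmark.

    Imports System.Diagnostics
    Imports System.Collections.Generic

    Module StructureTest
        Sub Main()
            Const n As Integer = 100000
            Dim arr(n - 1) As Double
            Dim lst As New List(Of Double)(n)
            Dim dict As New Dictionary(Of Integer, Double)(n)
            For i As Integer = 0 To n - 1
                arr(i) = i
                lst.Add(i)
                dict(i) = i
            Next

            ' Time the same work against each structure in turn.
            Dim total As Double = 0
            Dim sw As Stopwatch = Stopwatch.StartNew()
            For i As Integer = 0 To n - 1
                total += arr(i)
            Next
            Console.WriteLine("Array:      " & sw.ElapsedTicks & " ticks")

            sw.Restart()
            For i As Integer = 0 To n - 1
                total += lst(i)
            Next
            Console.WriteLine("List:       " & sw.ElapsedTicks & " ticks")

            sw.Restart()
            For i As Integer = 0 To n - 1
                total += dict(i)
            Next
            Console.WriteLine("Dictionary: " & sw.ElapsedTicks & " ticks")

            ' Print the running total so none of the loops can be treated as dead code.
            Console.WriteLine("(checksum " & total & ")")
        End Sub
    End Module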

Next, a comment from Ken. He points to Lists, as they’re something he uses regularly, and he talks of some of the benefits. Again, I’d previously come across articles saying lists were slow, but maybe I was too quick to dismiss them. Betfair Pro Trader has also suggested using lists and dictionaries combined. Ken adds that he codes in C# (C sharp), but I think for the purpose of data structures and speed they are similar (C# and VB.NET compile to the same intermediate language and run against the same runtime libraries).
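As a rough illustration of the “lists and dictionaries combined” idea, this is the kind of thing I understand it to mean – a Dictionary keyed by selection ID for fast lookup, with each entry holding a List of that runner’s price history. The names and the Long selection IDs are mine for the example, not anything lifted from my bot.

    Imports System.Collections.Generic

    Module PriceStore
        ' Dictionary for fast lookup by runner, List to keep the ordered price history.
        Private ReadOnly pricesBySelection As New Dictionary(Of Long, List(Of Double))

        Sub RecordPrice(selectionId As Long, price As Double)
            Dim history As List(Of Double) = Nothing
            If Not pricesBySelection.TryGetValue(selectionId, history) Then
                history = New List(Of Double)()
                pricesBySelection(selectionId) = history
            End If
            history.Add(price)
        End Sub

        Function LastPrice(selectionId As Long) As Double
            ' Returns 0 if this runner hasn't been seen yet.
            Dim history As List(Of Double) = Nothing
            If pricesBySelection.TryGetValue(selectionId, history) AndAlso history.Count > 0 Then
                Return history(history.Count - 1)
            End If
            Return 0
        End Function
    End Module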

n00bmind added a detailed comment. He makes the point that the advantages of one structure over another are not always as claimed, as mentioned above. He also goes on to agree with previous comments that my speed question may be missing the main issues – those being the program/algorithm itself and network latency. Further advice is given about profiling (something I haven’t come across as a specific process before) and maybe using a different language, such as Python (I have only a basic understanding of Python from messing with it on my Raspberry Pi).

Finally, Jptrader commented, agreeing mostly with n00bmind, and others, about looking at “handling network latency properly and doing performance profiling”.
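To make that a bit more concrete, here is how I understand the split they’re suggesting – time the round trip to the exchange separately from the local processing, so you can see which one is actually eating the milliseconds. GetMarketData and ProcessMarketData are placeholders for whatever your bot does, not real API calls.

    Imports System.Diagnostics

    Module TimingSplit
        Sub CheckOneCycle()
            Dim sw As Stopwatch = Stopwatch.StartNew()
            Dim rawResponse As String = GetMarketData()   ' placeholder for the request to the exchange
            Dim networkMs As Long = sw.ElapsedMilliseconds

            sw.Restart()
            ProcessMarketData(rawResponse)                ' placeholder for deserialising + trading logic
            Dim processingMs As Long = sw.ElapsedMilliseconds

            Console.WriteLine("Round trip: " & networkMs & " ms, local processing: " & processingMs & " ms")
        End Sub

        Private Function GetMarketData() As String
            ' Stand-in only – the real request goes here.
            Return ""
        End Function

        Private Sub ProcessMarketData(raw As String)
            ' Stand-in only – the real parsing and algorithm go here.
        End Sub
    End Module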

Although a simple answer hasn’t been found (because there isn’t one), I’m guided by these comments to focus more on my code, handling serialization and latency, making the algorithm efficient and using the data structures that work for now, whether that’s arrays, collections, dictionaries, lists or a combination of these. Moving to another language just isn’t feasible for me at the moment; it’s taken me over a year to get a running bot in VB, with limited hobby time. I am happy to accept that another language may have its advantages, so would advise others to look at this when optimising their bots’ performance (for me the gain will come from moving from VBA to VB.net).

The testing I’ve done hasn’t shown any particular advantage for the different structures. From my searches on the web, I think this could be due to the relatively small amount of data I’m handling (many articles talk of tens to hundreds of thousands of data lines when comparing structures). An error on my part also had my bot making double calls for data, which added to my difficulties and questions initially.

I have plenty to be getting on with for now and will continue looking to improve my bots. Thanks again for all the comments.

April ’17, with charts. And some charts for March.

Charts are back. First up are the charts for the period 13th to 31st March. The figures won’t match the March post for obvious reasons but will follow on from the last P&L charts.

[Chart: 170331]
[Chart: Aus170331]

On to April. The dogs had an OK month overall, but the profit mainly came in the first half and there wasn’t much after. No changes were made, so I’m not sure why it happened. Maybe the strategy is losing its edge.

[Chart: 170430]

The Aus horses have not done well. I stopped trading on the 29th with a view to resetting some limits. It took me five days to actually sit down and do something. Minor changes were made to the limits and the stake was reduced by roughly 75%. Up to yesterday, not much had improved.

[Chart: Aus170429]

I haven’t done as much programming as I’d like recently, but the garden is coming on and I’ve done a few guvvy jobs to help pay for it, so not much spare time. I’ve had a few comments on here and Twitter about speed, which I will put into a separate post when I get time. The latest one, from n00bmind, was detailed and worth a read in itself.

In the meantime I’ve added an Oscar clone to the VPS to have a go at the Aus dogs. No idea what the liquidity is like, but we will see.

March ’17

Results

In a change to previous reporting, I am moving from weekly to monthly stats. And no charts. Here are the results for March, beginning to end –

UK Dogs – a slow start to the month, but then it went on to give a steady return. A good result overall, with no changes to be made.

Markets = 1595
Bets = 10409
Volume = £51949.79
Profit = £59.65
Return = 0.115%

AUS Horses – the first half of the month saw profitable trading, but the second half was all over the place: reasonable runs followed by sharp drops and streaks of losing markets, with no further profit added (the start of April has been the same). If this continues I will pause Aus trading.

Markets = 821
Bets = 6606
Volume = £62277.75
Profit = £26.55
Return = 0.043%

Comment
Steve commented –

The API crashing should have very little effect on a bots overall profitability other than the fact you’re missing out on opportunities when the site’s offline.

In a way, I agree. If I have an open position when a crash happens, I will either win or lose an unusually large amount for me. If this happens regularly then, so long as I have enough cash in the bank, these wins and losses should roughly balance out. (This is stretching the idea of chance and puts a lot of hope on a balance appearing across a relatively low number of events. I’d rather not rely on this to cover the effect of crashes.) Steve continues –

Every bet you place with a bot should be sent because you believe it to be value at that time and it should be allowed to stand on it’s own merits.

I think at this point Steve has missed my approach to the markets. With regards to the outcome of the event, I have no idea if my entry point is at “value”. This is because I have no interest in, or care for, the event. It matters nothing to me if it’s dogs, horses, pigeons, camels, Pooh sticks or bottled messages that are racing. I am trading on the market movement, not the event. I believe my entry point has a positive statistical value if I can exit shortly after. Therefore, I never want a bet to stand “on it’s own merits” because it doesn’t have any merits (on its own). Look at it like this – I think the price is shortening and assume that there are willing backers and layers in the market. I effectively jump in between a backer and a layer, giving them both slightly poorer odds than they could have got and skimming a little bit for myself (there’s a rough worked example at the end of this post). That’s how I see it. Steve goes on –

I can understand the mentality of wanting a green book at the off but trading will always be easier to do manually rather than having some set time or ticks to balance your bets.

I disagree. Steve finishes with –

There’s a lot of easy money to be made botting don’t go wasting your time trying to tick for pennies.

In short – “trading will always be easier to do manually” even though “There’s a lot of easy money to be made botting“. With that logic, any mouse-clicking screen-watcher should be raking it in. This I doubt. And the idea of any easy money left on the exchange is one I don’t believe. But if you have found it, screw it for every last penny and don’t tell anyone.
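For anyone wondering what the “skimming a little bit” mentioned above looks like in numbers, here’s a rough worked example of the greening sum. The prices and stakes are made up for illustration (and commission is ignored) – they’re nothing to do with Oscar’s actual settings.

    Module GreeningExample
        Sub Main()
            ' Back at the higher price, lay at the lower one, and size the lay so the
            ' profit is the same whichever way the race goes (commission ignored).
            Dim backStake As Double = 10.0
            Dim backPrice As Double = 2.02
            Dim layPrice As Double = 2.0

            Dim layStake As Double = backStake * backPrice / layPrice   ' = £10.10

            Dim profitIfWins As Double = backStake * (backPrice - 1) - layStake * (layPrice - 1)
            Dim profitIfLoses As Double = layStake - backStake

            Console.WriteLine("Lay stake:          £" & layStake.ToString("0.00"))
            Console.WriteLine("Profit if it wins:  £" & profitIfWins.ToString("0.00"))
            Console.WriteLine("Profit if it loses: £" & profitIfLoses.ToString("0.00"))   ' both £0.10
        End Sub
    End Module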

Weeks ending 12-03-17

Well, nobody spotted last week’s howler – I only titled it “Weeks ending 06-03-17”. I guess you did see it but found more amusement in keeping quiet. You are fun.

algotradingforfun added this comment –

Great 2nd week there. Need to think about handling the bf crash scenario when in autopilot. I don’t think it would be a disaster if not about but does create some extra risk.

Thanks. For me the crashes can be a bit annoying. Oscar backs first, so the greatest loss is the stake, assuming a clean-cut crash. If you’re laying first, the exposed risk between entry and exit is far greater; add multi-runner trading and it increases further – something to consider when setting up a bot.
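To put made-up numbers on that (ignoring commission): backing first, a clean crash costs you at most the stake; laying first, it costs you the lay liability, which grows with the odds.

    Module CrashExposure
        Sub Main()
            Dim stake As Double = 10.0
            Dim price As Double = 5.0

            ' Back first, crash before the exit lay: worst case is losing the stake.
            Dim backFirstWorstCase As Double = stake                  ' £10

            ' Lay first at the same price, crash before the exit back: worst case is the liability.
            Dim layFirstWorstCase As Double = stake * (price - 1)     ' £40

            Console.WriteLine("Back-first worst case: £" & backFirstWorstCase)
            Console.WriteLine("Lay-first worst case:  £" & layFirstWorstCase)
        End Sub
    End Module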


Mike also commented –

The regular Betfair crash is a royal pain. Your take of their response is amusing and spot on. There is an API status page (not widely publicized) which is a little more real time than the “help” desk. Don’t know if you can link your bot to the status but might be an option. http://status.developer.betfair.com/

Thanks, again. A pain, agreed. I saw this status link on Twitter for the first time after this last crash and it does provide some confirmation, but it did seem a bit delayed. After I’d first seen the tweets I looked at the status page and only one request was showing problems (/listmarketcatalogue maybe?), so trial and error would show whether it could provide any use to a bot. But it was certainly ahead of the Saturday boy and his well-thumbed guide.
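I haven’t tried wiring it in, but a crude first pass might be to fetch the page and look for the “all clear” wording before letting the bot trade. The phrase searched for below is a guess on my part rather than anything documented, so treat this as a sketch only.

    Imports System.Net

    Module StatusCheck
        ' Crude check of the Betfair API status page. The phrase searched for is an
        ' assumption – look at the page yourself and adjust before relying on it.
        Function ApiLooksHealthy() As Boolean
            Try
                Dim html As String
                Using client As New WebClient()
                    html = client.DownloadString("http://status.developer.betfair.com/")
                End Using
                Return html.IndexOf("Operational", StringComparison.OrdinalIgnoreCase) >= 0
            Catch ex As Exception
                ' Can't reach the status page – assume the worst and stand the bot down.
                Return False
            End Try
        End Function
    End Module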


Just one week on these charts. Interesting profile on the dogs – flat at the start, flat at the end, with a sharp rise Friday/Saturday. All figures are in line with the previous period, which is good.

[Chart: 170312]

[Chart: Aus170312]

Another milestone was passed with these results: I became eligible to pay the premium charge as my lifetime charges percentage dropped just below 20%, to 19.92%. I’d already used some of my allowance, which I think was linked to data charges that are no longer applied. So this week saw £1.98 taken off my allowance; at that rate it’ll be 9 years before I actually pay anything. Unfortunately, if my total charges percentage continues to fall, the weekly PC will rise – a drop to 19.72% would have seen a PC of £5.50. This is the price of (small) success. On a positive note, this does put me in a bracket with 0.5% of customers which, if Wikipedia can be believed, is either 20,000 or 5,500 people. What joy.
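For anyone trying to follow the sums, my understanding of the weekly calculation (from Betfair’s published rules – check them yourself, there are extra conditions I’ve left out) is roughly this: if your lifetime total charges sit below 20% of lifetime gross profits, the week’s premium charge is 20% of that week’s gross profit less that week’s charges, taken off the £1,000 allowance until it runs out. The figures below are made up, not mine.

    Module PremiumChargeSketch
        Sub Main()
            ' Hypothetical week – a rough sketch only, not the full set of rules.
            Dim weeklyGrossProfit As Double = 20.0
            Dim weeklyCharges As Double = 1.0          ' commission actually paid that week

            Dim weeklyPC As Double = Math.Max(0.0, 0.2 * weeklyGrossProfit - weeklyCharges)   ' £3.00

            Dim allowanceRemaining As Double = 950.0   ' whatever is left of the £1,000
            allowanceRemaining -= weeklyPC             ' deducted from the allowance before anything is paid

            Console.WriteLine("Premium charge this week: £" & weeklyPC.ToString("0.00"))
            Console.WriteLine("Allowance remaining:      £" & allowanceRemaining.ToString("0.00"))
        End Sub
    End Module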