This Week in osu!

published 07 Dec 2012

This week I set aside a good block of time to replace the registration and activation process for osu!. The main reason for focusing on this area is the number of emails I receive each week from people who have issues with either the captcha or the email verification/confirmation step, requiring manual intervention.

  • The registration process was completely rewritten to be as simple as possible. Rather than hinging on standard verification methods, new users are now only asked for their username, email and password (and only once for each). Once these are validated, they are asked to log in to osu! (a rough sketch of this flow is included at the end of this list).

    registration step 1

    The login to osu! not only double-checks they entered the password they think they did, but it also means they are instantly logged in without any annoying email checking getting in the way.

    registration step 2

    The main disadvantage to this change is that anyone who cannot access an osu! client will not be able to register. This means that (for the moment), osu!droid or osu! iPhone users will not be able to participate in the forums. I may provide a solution for these people in the coming week(s).

    For a limited time, you can test the new registration process for yourself on the test server. Note that you will need to run the test build of osu! to complete activation. Accounts created here will be deleted around every three hours, so don’t worry about making more than one (or entering valid details).

  • When creating new accounts, users are now warned if it is likely they are about to (unknowingly) create a multi-account.

    warning

  • The download page got a much-needed redesign. I plan on doing some more work here, but it is already feeling a lot nicer.

  • Next up is fixing the email change, password change and forgotten details processes to bring them in line with the quality and simplicity of the new system. I hope to complete these next week.

  • Tournament spectator client has some upgrades! It now synchronises playback between all clients, which looks pretty damn cool in practice. I can’t wait to see what people think during the OWC matches this weekend.

    The spectator setup process is now completely automated, meaning streamers no longer need to track player or song changes. On providing a multiplayer match ID, osu! will handle the rest. Automation Ho!

  • Previously mentioned bancho protocol optimisations have gone live and have reduced overall outgoing traffic by around 5-10%. This is quite a substantial improvement, especially since the saving occurs during game client connections, reducing the delay before users can enter multiplayer and the like by around 20%. I still have a few more improvements up my sleeve for further down the track :).

  • I changed the file structure of avatar storage to allow for easier access from multiple sources. File extensions are no longer provided, and file types are instead determined by the display logic on the client side (see the type-sniffing sketch at the end of this list). As a result, there are no longer conflicts when users upload avatars in different formats over a period of time, and the logic for displaying avatars is super-simple. As an interesting tidbit, osu! has over 5.6gb worth of 128x128 avatars in its store!
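
For the technically curious, here is a minimal sketch of how a two-step flow like the new registration could be wired up server-side. The function names, validation rules and in-memory "database" are purely illustrative assumptions, not osu!'s actual code; the point is simply that the web form only validates and stores the three fields, and the account only becomes active once the game client logs in with the same credentials.

    import hashlib
    import os
    import re

    accounts = {}  # illustrative in-memory store; the real system uses a proper database

    def register(username: str, email: str, password: str) -> None:
        """Step 1 (web form): validate the three fields and store an inactive account."""
        if not re.fullmatch(r"[A-Za-z0-9 _\[\]-]{3,15}", username):
            raise ValueError("invalid username")
        if not re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", email):
            raise ValueError("invalid email")
        if len(password) < 8:
            raise ValueError("password too short")
        salt = os.urandom(16)
        accounts[username.lower()] = {
            "email": email,
            "salt": salt,
            "hash": hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000),
            "active": False,
        }

    def client_login(username: str, password: str) -> bool:
        """Step 2 (osu! client): a successful login doubles as account activation."""
        account = accounts.get(username.lower())
        if account is None:
            return False
        attempt = hashlib.pbkdf2_hmac("sha256", password.encode(), account["salt"], 100_000)
        if attempt != account["hash"]:
            return False
        account["active"] = True  # no captcha or email round-trip required
        return True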
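
And since avatars are now stored without extensions, here is a small sketch of the kind of type sniffing mentioned in the last bullet: the format is read from the file's leading bytes rather than its name. The magic numbers are the standard PNG/JPEG/GIF signatures; the function itself is a hypothetical example, not the actual display code.

    def sniff_image_type(path: str) -> str:
        """Guess an image's MIME type from its leading bytes instead of a file extension."""
        with open(path, "rb") as f:
            header = f.read(8)
        if header.startswith(b"\x89PNG\r\n\x1a\n"):
            return "image/png"
        if header.startswith(b"\xff\xd8\xff"):
            return "image/jpeg"
        if header[:6] in (b"GIF87a", b"GIF89a"):
            return "image/gif"
        return "application/octet-stream"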

Going to be another busy weekend with OWC matches throughout Saturday and Sunday. Make sure to tune in and witness the improvements to the streaming setup!


This Week in osu!

published 01 Dec 2012

With the OWC tournament ongoing, I have been helping out in that area and further improving broadcasting tools. As a result, not much visible progress was made this week.

  • Multiplayer history pages on the osu! website now update in real-time! I really wanted to provide an up-to-date view of scores without forcing people to mash the refresh key, and am happy with the result. Updates propagate within 5 minutes of in-game events. I also redesigned these pages to be more readable and to use the new beatmap info panes.

    multiplayer history

  • I have made further improvements to the tourney spectator client, removing file access conflicts when downloading new beatmaps and/or processing the beatmap database. The clients now communicate with each other to schedule updates, with the first client being the only one to perform the grunt work (a sketch of this rule is included at the end of this list). Country labels were also added, so streaming plugins are no longer required for these details.

  • When making some web changes, I found some code used for user rating of maps which had not yet been updated to use the latest database API, so I spent some time fixing it up. In the process I removed a good 0.5gb of redundant data and improved the efficiency of beatmap ratings by a whole lot!

  • A lot of new localisable fields were added, including newer options screen text (by woc) and main menu tips (by me). I made further improvements to the usability of the localisation spreadsheet as well. At least three new languages went from non-existent to 100% completion this week. Let’s just say I am very impressed with this outcome, although I’m not as impressed that the Japanese translation is completely stagnant at lower completion than the existing v1 translation (any takers?!).

  • I have started automatically reporting beatmap corruption to the server so I can figure out any remaining causes and iron things out.

  • I have begun micro-optimisations on the bancho network stack to reduce the size of user presence packets and hopefully speed up the initial osu! server connection, which can be seen struggling when there are 8,000 users online. It is currently sending about 250kb of data; so far I have reduced this by 35%, and I hope to gain further reductions before making this live (some back-of-the-envelope numbers are included at the end of this list).

  • I received my t-shirts and hoodie order from the osu!market, and am thoroughly pleased with the quality! We have also recently added some mugs to the store with the same designs, so go check them out if that is your kind of thing :).

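A rough sketch of the "first client does the grunt work" rule mentioned above: one spectator instance is elected to download and process new beatmaps while the others simply wait for its signal, so only a single process ever writes to the files. The function names and the lowest-ID election rule are assumptions for illustration, not the actual implementation.

    def pick_worker(client_ids: list[int]) -> int:
        """'First client' rule: the lowest-numbered spectator instance does the work."""
        return min(client_ids)

    def should_download(my_id: int, client_ids: list[int], beatmap_present: bool) -> bool:
        """Only the elected client touches the files; everyone else waits for its ready signal."""
        return not beatmap_present and my_id == pick_worker(client_ids)

    # e.g. with four spectator clients and a missing beatmap:
    #   should_download(1, [1, 2, 3, 4], beatmap_present=False)  -> True
    #   should_download(3, [1, 2, 3, 4], beatmap_present=False)  -> False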
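
And some back-of-the-envelope numbers for the presence packet work: 250kb spread over roughly 8,000 online users is about 32 bytes per user, so a 35% cut brings the initial burst down to around 162kb. The packed record below is a hypothetical layout to show the general idea of trimming fields down to fixed-size binary; it is not the real bancho protocol.

    import struct

    def pack_presence(user_id: int, timezone: int, country: int, rank: int) -> bytes:
        """Pack one user's presence into a compact little-endian record (10 bytes here)."""
        return struct.pack("<IbBI", user_id, timezone, country, rank)

    assert len(pack_presence(2, 8, 16, 12345)) == 10

    per_user_now = 250 * 1024 / 8000            # ~32 bytes per online user today
    per_user_after = per_user_now * (1 - 0.35)  # ~21 bytes after the 35% reduction
    print(f"{per_user_now:.0f} B -> {per_user_after:.0f} B per user, "
          f"~{250 * (1 - 0.35):.0f} kb in total")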

As a reminder, we have been streaming the osu! World Cup via Twitch, hitting around 3.5k viewers at peak, so make sure to check out some of the matches (the schedule is listed below the stream, and you can also access recordings in the “Videos” tab).


This Week in osu!

published 23 Nov 2012

Another relatively busy week on the server maintenance side of things, switching out the old bancho server with the new one. There also seems to be a sudden surge in tournaments being organised, which triggered a temporary change in priorities!

  • Bancho is now running on the new server. The switch was made off-peak on the 19th, and no downtime was experienced. It did require an osu! client update, which was rolled out over a few hours beforehand. Non-updated clients are not able to connect but, as always, are updated automatically.

    The new server is provisioned for 66% extra bandwidth (allowance, not speed), and should be able to handle increasing traffic well into next year.

  • I shifted server monitoring to a hosted service (ScoutApp). This was primarily done to alleviate the risk of failure (and the maintenance burden) involved in running monitoring myself, which was previously done using Cacti.

    ScoutApp was my choice for monitoring because it offers reasonable pricing and easy setup. I am still running my monitoring on their one-month trial, but am happy enough to continue using it into the future.

    sample dashboard

    The ability to log many stats and create dynamic graphs at any point in the future makes ScoutApp super powerful. It also has an open plugin architecture, which allowed me to graph .NET metrics even though they don’t actually support Windows Server as an OS.

  • The screenshots page now shows a selection of relatively-popular user uploads (those things that come from hitting Shift-F12 in-game).

  • I implemented a new spectator mode for tournament use, which automates the process of setting up an array of spectator clients to live-stream tournaments. This will be used for streaming the upcoming osu! World Cup #3 (starting in just over six hours, running for a month or so) and a smaller Japanese osu!mania tournament running this weekend (Saturday UTC11:00~).

    I have plans to further improve this functionality and make it semi-public, so anyone wishing to use it can apply for temporary access. It is currently available via private application only. If interested, please make sure your tournament is well set up, has a thread and reliable staff, and is fully scheduled before contacting me via PM for access.

  • While investigating performance issues (latency occurring on the last hitcircle of a beatmap), I accidentally found other areas of osu! I could optimise. This ended in a rewrite of texture handling, which means that when running in DirectX mode, osu! no longer needs to store a copy of textures in RAM once they have been sent to video memory. This resulted in a ~33% decrease in memory consumption across the board, and even more in some cases.

    The reason textures were stored in RAM is that XNA allows developers to be lazier in how they load textures: if a graphics context is lost (window moved to another monitor, resolution changed, full-screen toggled etc.), there is no extra code complexity needed to reload them. osu! already handles this correctly, so having XNA manage textures as well was completely redundant (a sketch of the pattern is included at the end of this list). I was very happy to find this one, as it was a quick win which will help users with low-spec machines across the board!

  • I found and fixed a memory leak occurring when loading custom hit-sounds. After loading any sample, the sample was being freed using an incorrect call, leaving some resources hanging on to invalid memory references. This may have caused crashes or excessive memory usage after several hours of playing maps using custom samples.

  • Localisations for osu! have been updated to use the new Google Sheet localisation method. This should allow for very speedy changes and additions to localisations. A total of 26 languages are now available for selection in osu!.

    Every time I build a new public release of osu!, new translations are automatically pulled from the shared spreadsheet. I do take the time to look over changes and make sure there have not been any malicious modifications/additions (since it is publicly editable) before pushing them out to users. That said, localisations are not tied to individual osu! builds, and will be updated automatically when available.

    The code to integrate reading from Google Docs was handily borrowed from osu!stream, where it has proven to be a very efficient method of localising in-game text (a rough sketch of the approach is included at the end of this list). I do plan on adding more localisable text in the near future, including eventual editor support.
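
Here is a language-agnostic sketch of the texture handling change described above: keep only the GPU-side handle plus enough information to recreate a texture from disk, instead of a duplicate copy of its pixels in RAM, and simply re-upload everything when the graphics context is lost. The class and callback are illustrative, not osu!'s actual (C#/XNA) code.

    class TextureCache:
        """Stores GPU handles only; no CPU-side pixel copies are kept around."""

        def __init__(self, upload_from_disk):
            # upload_from_disk(path) -> opaque GPU texture handle
            self.upload_from_disk = upload_from_disk
            self.handles = {}

        def get(self, path):
            if path not in self.handles:
                self.handles[path] = self.upload_from_disk(path)
            return self.handles[path]

        def on_context_lost(self):
            """Resolution change, monitor switch, full-screen toggle etc."""
            self.handles.clear()  # textures are simply re-uploaded on the next get()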
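
And a rough sketch of the build-time localisation pull: the shared spreadsheet is fetched as CSV and turned into a per-language dictionary of key/string pairs. The export URL pattern and the column layout (keys in the first column, one language per remaining column) are assumptions for illustration, not necessarily how osu! actually reads the sheet.

    import csv
    import io
    import urllib.request

    SHEET_CSV_URL = "https://docs.google.com/spreadsheets/d/<sheet-id>/export?format=csv"

    def fetch_localisations(url: str = SHEET_CSV_URL) -> dict:
        """Return {language_code: {key: translated_string}} from a published sheet."""
        with urllib.request.urlopen(url) as response:
            rows = list(csv.reader(io.TextIOWrapper(response, encoding="utf-8")))
        header, body = rows[0], rows[1:]
        languages = header[1:]  # first column holds the string keys
        return {
            lang: {row[0]: row[i + 1] for row in body if len(row) > i + 1 and row[i + 1]}
            for i, lang in enumerate(languages)
        }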

That’s all for this week! Let me know if you want less or more detail, if you want to know about different aspects I have failed to cover, or even just that you enjoyed this post :).


this week in osu!

published 16 Nov 2012

I've decided to start this forum as a place to discuss what has happened over the last week in osu!. I'm taking this idea from the WebKit/Chromium developers, who use a similar approach to sum up changes at a higher level than the raw changelog, which may not reflect where time has been spent over the week. A lot goes on behind the scenes of osu! and I'd love to share more of this with you guys :).

Here goes!

  • osu!mania ranking was reset, and an initial implementation of pp is present. I still want to tweak these calculations, as they are working with much smaller numbers than I am used to when balancing the pp system.
  • I have been working with a new test environment containing 360gb of beatmaps (over 100,000 difficulties) in order to stress test various parts of osu!. My main focus here is the song select screen, which suffers from decreased frame rates at higher beatmap counts, along with trying to replicate the occasional lag spikes that occur for some users during play mode. As a result, I have added the experimental ability to store songs in a non-standard song folder (add a BeatmapDirectory key to your config file containing the absolute path; an example is included below this list). No support will be given for users attempting this. Keep in mind it will trigger a full reprocess of your beatmaps when switching paths.
  • I realised I had not yet switched the main osu! web server across to PHP 5.4 (it was still on 5.3.x), even though I had extensively tested against 5.4, so I decided to make the switch on Wednesday. Due to an incorrect configuration file and a lack of error messaging on PHP's end, there was 15 minutes of downtime during this switch where the website and score submission were not available (see http://stat.ppy.sh/322866/2012/11). Easily avoidable, but nothing too major, and during off-peak times.
  • While we already have the test/sandbox server ha.ppy.sh, yesterday I completed the last few steps required to allow deploying a dev server for the osu! website on a local machine. This means that with a spare Ubuntu VM and around 15-20 minutes to set things up, I can have a fully working osu! website locally, for even faster previewing of changes. This should speed up development significantly in some specific cases.
  • We hit a new high with 8,415 users connected on the 11th!
  • I commissioned a new server for bancho in order to keep up with bandwidth requirements from the ever-increasing user count (especially at peak times). This is mainly to take advantage of new offers available at the datacentre and, as a result, keep servers affordable. Datacentres generally make their money from the fact that most people don't bother migrating servers off deprecated plans, meaning they keep paying old rates even as prices decrease over time. By spending a bit of time and recycling old servers, it is very possible to save big, as in this case, where I was able to double my bandwidth allowance without increasing base costs. Migration to the new server will happen early next week and no downtime will be necessary.
  • I decided to increase the BSS upload allowance by 1~2 maps across the board. Go submit those maps you have been waiting to submit for a while :).
  • The osu! world cup is finalised and ready to start in just over a week's time!
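
For anyone wanting to try the non-standard song folder mentioned above, the key is just a plain key/value line in your osu! user config file; the exact path below is only an example:

    BeatmapDirectory = D:\osu! songs
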
I'll be posting these once a week on Fridays, so follow along with your RSS reader or whatever works best. Feel free to post comments if you have questions or suggestions!

developing an allergy to apples

published 25 Oct 2011

Recently I have been heavily involved in Apple platform development -- specifically of the mobile variety. This has involved launching and maintaining a prominent e-book reader (with a user base of 500k+) alongside my personal projects' apps (puush and osu!stream). While I generally try to see the best of any situation, developing for iOS has fueled stress and anger that I am not used to, and even made me consider at times that changing profession wouldn't be such a bad thing.

And don't get me wrong, I love coding. I live for code. So something must be drastically out of balance here, right? While some of the issues I have are specific to my own workflow and methodology, I think most of you will be able to agree on some level that things could definitely be greener in the Apple ecosystem. I'm going to touch on a few specific areas of iOS app development over the last couple of years which have hampered my productivity. Hopefully it will serve as a light warning to those looking at entering this arena.

Apple always has the last say

Since releasing our e-book reader, we have had to remove over half the functionality in order to keep it live on the store. After Apple decided to enter the e-book market, it was initially unclear how we would be affected; I knew there would be a drop in sales as Apple leveraged their App Store and OS strongholds to promote their new app and store, but they did not stop at just that.

In a manner that I would personally say is many times worse than the age-old Microsoft anti-trust case, Apple gradually changed their policies to lock competitor e-book providers out of their system. It is obvious that they were not happy having customers use their devices to purchase books via other providers (even though all fulfillment was done via an external web link), and they gave us a deadline to make the transition to In-App Purchases. This would mean handing a 30% cut of revenue to Apple.

While reluctant to take this approach, we did end up attempting to embrace Apple's restrictions. Sadly, the publishing industry is a very complex one, and payments and pricing simply could not be moulded into Apple's tier-based pricing system, which does not allow for regional pricing, state-based tax, 100k+ product listings or rapid price changes, to name a few problematic points. Of course, Apple would have understood this from the beginning -- having to abide by the same contractual terms with publishers themselves when distributing via iBooks -- and pushed the deadline out further while they planned the road ahead for their competitors.

Then, along with Amazon and other key e-book providers, we were given an ultimatum: remove the store completely from the app. "Okay", we said, and replaced the in-app store with a link to our website. This was not enough to please Apple; within a day our app submission was rejected, and what followed was a few months of back-and-forth build rejections. The end result was the removal of every UIWebView from our application, along with every external link. Yes, we could no longer provide even a web link taking customers to our company's own web site (not even a "forgotten password" link!).

What remains is a thin-client reader which can do little more than read and sync books purchased outside the iOS environment. Potential customers cannot make a purchase unless they manually make the connection between our app and our website.

Too late to apologise

When I initially made osu!stream available, I ran into some licensing complications with the music I was using, resulting in the need to quickly pull the app (of my own accord) and replace the songs. A month later, I submitted an update to the app which, amongst many other improvements, mapped the previous song packs to newer ones. This was a way of ensuring users who bought the initial song packs didn't feel cheated after they were made inaccessible.

To reach the maximum number of users possible with instructions on how to retrieve the new content (and to provide a brief explanation of the events that took place), I wrote up release notes which had a paragraph dedicated to the licensing issues. In this paragraph, I apologised for what happened. This update got rejected on the basis that the release notes were "not related to the changes in the app".

When Apple reject an update, they generally do not give any specific instructions, instead only quoting the relevant sections of the Terms of Service and leaving the rest to you to figure out. I had a hunch that the word "sorry" triggered alarm bells, so I changed only that sentence; accordingly, it was approved one day later (n.b. they did actually check each localisation, so it took two shots!).

While this might not seem like a huge deal, it is just another example of the kind of randomly exerted control they have over the approval process.

As agile as a snail

One of the more apparent downsides of iOS development is the compulsory approval process associated with deploying an app to the App Store. Depending on seasonal demand, this approval process can take anywhere between 6 hours and 14 days (based on my experiences to date). The lower end of the scale is generally only seen when it is in Apple's interest to get it live quickly.

I work with a very rapid release cycle, usually aiming for daily releases -- pushing features and fixes the moment they are mature enough. At times this means relying on the end-user to do the "testing", but I find this to be the most productive release method. Keeping things fresh generally keeps people happy and coming back for more. Maybe I am designing for myself here, but if I were a user, I would enjoy seeing my feedback rapidly incorporated in a product.

Being limited by external forces to an average of weekly releases -- with no rapid user feedback in the interim -- makes development very staggered and unproductive for me. This is likely amplified due to the way I work, so I would be interested to hear other people's experiences with the imposed time-between-release limitation.

You might say "but you can deploy to testers as often as you want via a rapid deployment process (testflight is awesome)!", to which I would answer that I am definitely aware of this and am actively deploying regular builds using these methods. The main problem here is that each developer account is limited to 100 registered devices for device provisioning, and while this may sound like a large number, I found that a month or two after accepting testers, over 80% of them were inactive. You are only given one opportunity each year to remove a number of devices (at the anniversary of your account registration), which is both cumbersome and sub-optimal.

There are also many documented examples of extreme cases where apps get stuck in a pending state with absolutely no response from Apple. 2Do is one current example; they seem to be stepping into dangerous territory due to their use of iCloud's CalDAV sync. I have had some personal experience in trying to get a response from Apple's chain-of-command via phone and email, and can attest that it is not so easy. Lack of control over your own product's release cycle -- and a communication blackout from those in control -- is indeed quite a scary concept.

Think Different, or Don't Think at all?

The final straw (which actually triggered the writing of this article) was having my app rejected today for storing downloadable content in the application's "Documents" folder. Since the introduction of iCloud, the guidelines have been changed to only allow "user-created" documents to be stored in this specific folder. This wouldn't be such a bad thing if it wasn't the only non-volatile storage area available to developers.

In short, this means that if I and other developers in a similar predicament are to abide by the new terms, iOS can choose to delete all downloaded content in any app it sees fit in order to free up space. Users are not given the option to disable this behaviour -- let alone warned about it when it happens. This is a support nightmare for app developers, whom users are going to obliviously blame for the loss of their files.

The new "cleaning" behaviour (more details in this article) is going to surprise many users over the coming months as their files disappear at iOS' whim. In osu!'s case, this means all purchased songs would disappear, requiring special support at an application level to track when files have disappeared, alert users and trigger a fresh (and still quite awkward) in-app purchase process to re-retrieve the content. With a bit of luck, the user will have an internet connection available at this point, but what happens if they are in the middle of a long flight, or in an area without coverage?

I have been following this behavioural issue since it was discovered earlier this month, but didn't think I would be affected -- after all, I am only storing around 10mb (at most) in the Documents folder. It would seem that for whatever reason, Apple is not only actively enforcing this new clause, but cracking down on it hard.

Waiting a week for approval only to be rejected is quite a bad feeling (especially when you have other coinciding release plans), but when the rejection is due to this kind of issue it really makes you wonder if Apple know what they are doing. And it makes me ask myself whether I really want to continue developing for a platform this unstable.

--

Hopefully this didn't come out as too much of a rant -- I do always try to look at things from a neutral perspective. I won't be stopping iOS development, but may take a few days off to regain some motivation :). I am not changing my app's behaviour to meet the new clause requirements and have instead filed an appeal with Apple. I hope there is enough velocity from my appeal, along with the voices of other affected developers and users, to change the way things work.
