We’re in Berlin next week and we’re hosting a drinkup at the Prater Garten. Come say hello and grab a drink on us. Even better, bring a friend or two.
- When: Wednesday July 24th, 19:00.
- Where: Kastanienallee 7–9, Prenzlauer Berg
A few weeks ago Apple made some changes to its network. These changes prevented our ranks aggregation platform from consistently getting hourly ranks for some countries. After a few weeks of intense work and several major updates to our platform, we're happy to say that hourly ranks for iOS and Mac apps are now stable again.
Changing feed formats and locations is something we've gotten used to over the years. This time, however, the modifications needed on our side required changes to our core importing architecture and additional scale, both of which took a considerable amount of time to build and test. More time than we normally need.
Now that ranks are stable again we want to thank you for being patient and understanding with us.
A while back we introduced the ITC status page, which shows current and historical release times for daily reports from iTunes Connect. It's proven to be quite popular among everyone who relies on iTunes Connect, so today we're rolling out the same report for Google Play.
If you’re ever unsure why the latest Google Play data isn’t showing up in your account, this is a good first place to check.
The report is available at http://appfigures.com/playstatus or from the support area of the menu.
Making a website beautiful on screen is an art form. Most websites, however, tend to overlook what their pages might look like when printed out. Navigational elements like menus and notifications are important for a good user interface, but they turn into clutter on printed paper. Print layouts solve this problem by instructing the browser to render a report differently for print than for the screen.
We now have a print layout for every report on the site so printed reports only show the elements you’d want on a page. Simple, clean, and beautiful.
Apple has recently added two new data sets to iTunes Connect: monthly and yearly. As you may have already noticed, we’ve added support for both, so we now support the daily, weekly, monthly, yearly, and financial data sets across all sales reports.
Monthly reports are available for the last 12 months and all yearly reports are available from the beginning of time (2008).
You can easily switch to the new reports by selecting monthly or yearly from the report set menu at the top of your sales reports.
A note on financial reports: Up until this update, the Monthly/Financial report set contained financial reports for iOS and Mac apps. Since financial reports don’t align with monthly reports, we’ve separated the combined data set into two: Monthly and Financial.
Over the last few weeks some members have been experiencing connectivity issues with Google Play. The issue, which affected less than 10% of accounts, forced members to reset the password of their Google account after syncing because Google saw our sync as a suspicious login.
No one really knows how Google decides which login attempts are suspicious, which made resolving this a wee bit difficult. After several code changes, a new sync server, and ultimately direct help from Google, we’re happy to say that this issue has finally stopped.
If you had to re-link or re-verify your account over the last month or so we are sorry! Although this was somewhat outside of our control we feel it is our responsibility to provide smooth and issue-free syncing so we really appreciate the patience you’ve shown while we were working to resolve this.
Ariel and the appFigures Team
P.S. – We don’t believe you should have to pay for a service we did not provide, so we will be crediting affected accounts within the next few days.
We recently introduced a brand new section to the blog called Data Bits. This is where we choose an interesting set of data, analyze it, and turn it into bite-sized blog posts for your reading pleasure.
Last time we looked at app reviews and how many of them a typical developer can expect to have. This time we’re diving a bit deeper into the data to find out exactly what sort of message those reviews are trying to communicate.
While there are tons of reviews in almost every language, in this post we’ll be focusing specifically on ones written in English. Most text analysis tools are built around English which makes those reviews easier to analyze. It also happens to be the language we’re most familiar with.
After some internal discussion over the best way to slice the data by language, we decided to grab a slice of iOS and Mac App Store reviews from these major English-speaking countries: US, Canada, UK, Australia, and New Zealand. Our sample comes out to roughly 25 million individual reviews, more than enough to give us an idea of what’s going on in there.
Visualizing it all
Next we needed to decide how to analyze and present so much content. After much brainstorming we settled on a method which is both simple and effective: word clouds. Once all the pieces were in place, we threw all 25 million reviews into the blender and here’s what came out:
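For the curious, the weighting behind a word cloud is essentially just word frequency. Here's a minimal Python sketch of the counting step; the sample reviews, stop-word list, and function name are made up for illustration, and our actual pipeline is considerably more involved:

```python
import re
from collections import Counter

# Tiny illustrative stop-word list; a real one would be much longer.
STOPWORDS = frozenset({"the", "a", "an", "is", "it", "this", "and", "to", "i", "of"})

def word_frequencies(reviews):
    """Count how often each word appears across all reviews.

    A word-cloud generator sizes each word by a weight like this count.
    """
    counts = Counter()
    for review in reviews:
        for word in re.findall(r"[a-z']+", review.lower()):
            if word not in STOPWORDS:
                counts[word] += 1
    return counts

# A tiny made-up sample standing in for the 25 million real reviews:
sample = [
    "Great app, I love it!",
    "Love the new update. Great work.",
    "Crashes constantly. Waste of money.",
]
print(word_frequencies(sample).most_common(2))
```

The real job is the same idea at scale: tokenize, drop stop words, count, then hand the counts to the cloud renderer as weights.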
We were pretty surprised at how positive this word cloud seems to be. Being app developers ourselves, we’re quite familiar with how picky reviewers tend to get, and we assumed that reviews would be a bit less glowing and slightly more critical. So we ran the numbers again, but the same results came out. ‘Great’, ‘love’, ‘fun’, and ‘good’ are used way more often than words like ‘poor’, ‘useless’, ‘waste’, and ‘sucks’.
And that’s it… NOT
Just because a word is positive or negative on its own doesn’t mean there aren’t other words in the sentence modifying it. While evolution has fine-tuned us humans to identify such language nuances, it’s not so easy for a computer. So we started tinkering with the data to see if there was anything clever we could do to get a better idea of the context around each word.
We started off by sectioning the reviews according to their star rating. We figured that a review’s star rating (1–5) is usually a good indication of its overall sentiment.
We turned to the blender once more, this time creating a cloud of words from only 5-star reviews.
Compare that with a word cloud of all 1-star reviews:
There’s a definite contrast here, showing that words like ‘love’ and ‘beautiful’ aren’t thrown around as much in very negative reviews, while words like ‘crashes’ and ‘waste’ aren’t very popular in positive ones. We did the same breakdown with star ratings 2 through 4, and, as expected, there was a gradual change in the use of positive and negative words.
Adding some color
Armed with this new information we decided to try something crazy: we’d assign a ‘positivity’ score to each word, depending on how often it appears in positive (highly rated) reviews versus negative (low-rated) ones. We then recreated the original word cloud, this time coloring words with a high score green, those with a low score red, and everything in the middle gray.
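We haven't published the exact formula, but the idea can be sketched roughly like this in Python. The counts, threshold, and function names below are illustrative, not our production code:

```python
def positivity(word, pos_counts, neg_counts):
    """Score in [-1, 1]: how much more often a word appears in
    5-star reviews than in 1-star reviews."""
    p = pos_counts.get(word, 0)
    n = neg_counts.get(word, 0)
    if p + n == 0:
        return 0.0  # word never seen in either bucket
    return (p - n) / (p + n)

def color(score, threshold=0.4):
    """Map a positivity score to a word-cloud color."""
    if score >= threshold:
        return "green"
    if score <= -threshold:
        return "red"
    return "gray"

# Made-up occurrence counts per rating bucket:
pos = {"love": 900, "great": 800, "crashes": 50}
neg = {"crashes": 600, "waste": 300, "love": 100}

for w in ("love", "crashes", "waste", "update"):
    print(w, color(positivity(w, pos, neg)))
```

With counts like these, ‘love’ comes out green, ‘crashes’ and ‘waste’ red, and a word seen in neither bucket stays gray.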
We weren’t sure what to expect out of this experimental analysis method, but it turned out to be pretty spot-on. We were surprised at how well the algorithm does at coloring words with a negative connotation (such as ‘crashes’, ‘waste’, and ‘useless’) red, while highlighting the positive ones (like ‘great’, ‘love’, and ‘good’) in green.
So it looks like what we suspected originally about the critical and picky reviewer was wrong, and that the first word cloud above was pretty telling on its own: there are way more positive things being said about iOS and Mac apps than negative. Who would have thought?
Update (3/18, 6pm EST): It looks like iTunes Connect has been fixed. The spikes have been removed, and the downloads now match the raw reports.
Earlier today we started receiving reports about differences in data between iTunes Connect and appFigures. All the reports had the same pattern: iTC is showing more downloads than appFigures, and the data shown in iTC for yesterday (3/18) is significantly higher than expected.
After reviewing multiple reports and comparing raw data from the appFigures archive to raw data in iTC, it seems there is no discrepancy at all in the raw numbers. When we compare those numbers to the graphs in the iTC dashboard, however, what Apple is displaying appears too high. This makes us think the issue stems from iTC itself: the raw report Apple provides, which is what we import into your account, just doesn’t match the graphs or tables in the iTC dashboard.
You can double-check this by going to the Sales page of your iTC account, downloading the raw reports, adding up the units manually, and comparing the total to the graphs in the iTC dashboard.
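If you'd rather script the check than add the numbers up by hand, a sketch like this works, assuming the report is the usual tab-separated format with a Units column (the field names and sample rows here are illustrative; check them against your own report's header):

```python
import csv
import io

def total_units(report_tsv):
    """Sum the Units column of a tab-separated daily sales report."""
    reader = csv.DictReader(io.StringIO(report_tsv), delimiter="\t")
    return sum(int(row["Units"]) for row in reader)

# A tiny fabricated two-row report for illustration:
report = "SKU\tTitle\tUnits\nAPP1\tMy App\t12\nAPP2\tOther App\t30\n"
print(total_units(report))  # 42
```

Compare that total to the number the iTC dashboard graph shows for the same day; on 3/18 the two didn't match.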
So if you were really excited about today’s spike in sales, we’re sorry to be the bearers of bad news, but it’s probably just a mistake on Apple’s side.
We believe that the sales data currently in your appFigures account is the correct version, but we can’t know for sure until there’s word from Apple. Either way we’ll make sure that your appFigures account is in sync.
FYI – We reached out to Apple about this issue and will update this post when/if we get a response.
Not too long ago we built a completely new way of tracking app reviews by caching data for speedy full-text search and advanced filtering. That means we’re now downloading every single review for more and more apps every day in addition to the aggregate review data we already have. As you can imagine, that data occupies a lot of bits on our servers, but it also leaves us with tons of juicy data to analyze. In the coming weeks and months we’ll be writing several blog posts analyzing this data set and others.
It’s no secret that a few popular apps get a lot of reviews while most don’t even come close. We wanted to quantify this sentiment and figure out how many developers get how many reviews. We’re happy to share this information in the hope that it gives developers some insight into how their portfolios compare with those of other developers.
Our data set
There’s a lot of app review data out there from numerous countries and several stores, so for the sake of keeping this post short we decided to focus on a pretty specific slice of data, containing only iOS apps. Our sample consists of over 70 million aggregated reviews from 151 countries, which is enough to show some interesting and meaningful data.
Looking at this chart, three interesting points immediately jumped out at us:
- A typical iOS developer can expect between 1 and 5,000 reviews for their entire portfolio (a pretty huge range, if we may say so).
- Half of developers have more than 500 reviews.
- 5% of developers don’t have a single review for any of their apps (not even from mom!).
Also, the chart has three fairly equal slices inside the 1–5,000 range. This means that an almost-equal number of developers fall into the 1–100, 101–500, and 1,000–5,000 ranges. In which camp do you belong?
Looking at the extremes
A lot of developers would agree that a very large portion of reviews is held by a small portion of developers (the group in the small green slices in the chart above). Again, we wanted to quantify this, so we took chunks of various sizes from the developers with the most reviews and measured how many reviews they held (out of the total).
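The calculation itself is straightforward: sort developers by review count and sum the top slice. A rough Python sketch, with made-up numbers in place of our real data set:

```python
def top_share(review_counts, fraction):
    """Share of all reviews held by the top `fraction` of developers,
    ranked by per-developer review count."""
    ranked = sorted(review_counts, reverse=True)
    k = max(1, int(len(ranked) * fraction))  # size of the top slice
    return sum(ranked[:k]) / sum(ranked)

# Fabricated per-developer review counts for ten developers:
counts = [1000, 500, 200, 100, 50, 40, 30, 20, 10, 5]
print(round(top_share(counts, 0.10), 2))  # top 10% of developers
```

Run over the real 70-million-review sample, the same computation is what produced the shares discussed below.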
So it looks like a quarter of reviews are held by just one tenth of a percent of all developers. These are the developers in the extremes of our previous chart. In this case, the extremes are huge, with some developers having portfolios with millions of reviews.
This is all we have the time for today but there are still some unanswered questions: what types of apps are those 0.1% most-reviewed developers creating? Do specific genres or prices of apps draw more feedback than others? Are these reviews largely positive or negative? Next time we’ll explore some of these questions and more.
Almost everything we do is inspired by you, our members and readers, so please share your thoughts on this blog, twitter, or any of the other virtual hangout places out there.
We’ve heard from a number of our members in the last hour or so, informing us that Apple has replaced its data for February 13, 2013, because the originally published daily report may have been incomplete.
In order to ensure your reports are as accurate as possible, we recommend replacing the report for 2/13. We’re automatically replacing it in the background right now for all members who have auto-import enabled. If auto-import isn’t on in your account, you can replace the report manually by following the steps below:
Let us know if you have any questions or need additional help with getting your data updated.