System.NullReferenceException

WWDC 2013 – My hopes and predictions for hardware

Tomorrow is Apple’s big annual developer event. It will mostly be about iOS and OS X, but the rumor mill is also predicting some hardware changes in the product line-up.

The MacBook Air

It seems clear that we will see MacBook Air and MacBook Pro updates to Intel’s newly released Haswell chips. I predict these new models will be available either immediately at WWDC or right afterwards. Intel has a good push going for Haswell, and I bet they have their fabrication ducks in a row to keep yields up, allowing Apple to roll these out without any of the delays that last year’s iMac updates saw.

The MacBook Air is already pretty much as thin as it can be made with the technology and materials currently available.

I also doubt that Apple will be able to pull off putting a Retina display in the Air models this year, as much as I would love to see it happen. I doubt they can keep the battery life without having to fit more and/or better batteries into the form factor. They did manage this to deliver the 3rd-generation iPad with a Retina display, but I doubt it is possible for the Air just yet. There is also the problem that it would likely raise costs significantly, and a probable shortage of displays would cause rollout delays Apple is keen to avoid after the iMac fiasco.

I predict the MacBook Air will stay non-Retina for another year and keep the same form factor. Thanks to the improvements Haswell brings to the table, though, it will be more powerful and likely last longer on battery.

The MacBook Pro

That leaves the MacBook Pro as the place to make changes. With last year’s introduction of the 15" and 13" Retina models, the line-up now looks very visually distinct, which is not really Apple’s usual style. I predict they will want to unify these machines over time, likely around the 15" Retina form factor.

If Apple decides to drop the discrete graphics from their Retina MacBook Pro machines, they would likely be able to push in that direction. It seems likely that they could make those models slimmer, and if so I bet they will take the opportunity to unify the look and feel. We are likely going to keep seeing non-Retina models simply to keep affordable machines in the line-up, but it is clear that Retina is the way Apple is going, and the rest of the industry is following.

Any such change would depend on Apple being willing to trade off graphics performance by replacing the current generation’s discrete Nvidia chip with Intel’s new Iris integrated GPU. They would save power, and while the new Intel chip is good, it is still not powerful enough to compete on the same level as the chip Nvidia currently supplies. It is, however, quite close in a number of areas and even surpasses it in a few.

As a result, gaming performance would go down, but I expect desktop performance would be close enough to the current generation to make no real difference.

I suspect Apple will be willing to make that tradeoff; they have always had a thing for making thin, light and powerful machines. With an all-Intel solution they could deliver such an experience across their line-up (within reason on graphics performance) while completely removing the complicated and error-prone graphics switching the current model does to save power by changing from the Nvidia chip to the built-in Intel one.

The Mac Pro

I don’t think we will see the promised new Mac Pro, unless Apple swaps the Xeon-class chips they traditionally put in these machines for Haswell chips. That would mean a lower memory limit and the loss of dual CPU sockets. I am betting that they are holding out to deliver something later this year, as soon as the needed hardware is available. Apple has also hinted that the Mac Pro upgrade will be completely different, and it is likely the machine they will be assembling in the US.

The AppleTV

There has been much debate in the community about what Apple intends to do with their AppleTV hobby project. It has sold 13 million units to date, and in my own experience it is a fantastic device. I personally love the remote to death and have caught myself enjoying the feel of it in my hand more than is healthy.

The long-term play with the AppleTV is likely going to be broad home entertainment, but that would require more powerful hardware, developer access in the form of an SDK, and a new controller (if they want to encompass gaming, which I believe they do).

iOS is by the numbers already one of the largest gaming platforms in existence, and one can bring the iPad and iPhone screen to the TV via AirPlay over an AppleTV. This is, however, encumbered by a significant delay and isn’t really suitable for gaming.

To get gaming running I suspect Apple would need to beef up the AppleTV with more storage and more processing power, on both the CPU and GPU side. The Ouya shows us that ARM technology can support a console, though a fairly underpowered one compared to the Xbox One and PlayStation 4.

The model the Ouya espouses is yearly hardware upgrades at a $99 price point. This is what the AppleTV currently costs, and being able to entice consumers to upgrade yearly to newer and faster models is very much in line with Apple’s other businesses.

This would also allow Apple to iterate towards technology powerful enough to drive current AAA titles without having to spend years developing a console that must last 5–8 years the way competitors do.

They already have an OS that is proven to be solid and performant and has a wide selection of games, many of which could likely be adapted to work with a separate wireless controller on the AppleTV. Add to this that the AppleTV already has excellent media handling and a mature digital distribution channel in the form of the iOS App Store. Apple has on its hands a device that already has the powerful media partnerships its competitors are still building, an existing gaming ecosystem, and the service needed to drive it in Game Center.

So I think we might see an AppleTV model with increased storage, a quad-core A6-derived chip and a Bluetooth-connected wireless controller (sold separately at, say, $49–79). These hardware upgrades likely mean Apple will have to raise the price, but not by much; the device on its own could be a mere $149–179 and still net Apple a handy profit.

Such a device would be significantly less powerful than next-generation consoles like the PlayStation 4 announced this year, but it would be quite close to the current generation of consoles, so it would be by no means crippled.

Apple could also have quite a number of launch titles if they announce it now and start availability in a couple of months, or if they have managed to sneak models out to selected developers in the past few months who have all kept their mouths shut. This is quite possible, as we have just seen with the Xbox One and PlayStation 4, both of which had significant titles announced without leaks. Apple is if anything even better at containing such leaks than Microsoft and Sony, so it would not surprise me if they could pull that off.

Apple has also been hiring a lot of graphics chip designers, which seems to indicate they are aiming for some specialization in that area. On a console one could deploy such technology with fewer concerns about power usage than on the iPhone or iPad, so I suspect over time we will see Apple deploying its most powerful hardware in the AppleTV and working on scaling it down to the portable devices from there.

I hope that we will indeed see an AppleTV upgrade at WWDC along with the release of an SDK for developers. Current AppleTV devices will likely be left out of the new world of apps and games, but such is life. A $149–179 upgrade will set things right, and current AppleTV owners were never promised anything beyond what we have been happy with until now. I know I would happily jump on the wagon immediately.

I think Apple can pull this off, and it would certainly broadside its competitors as well as generate a new stream of income. All the pieces are there, and the execution would be fairly classic Apple. It would also be a sign that Apple can still keep a secret and stun the world.

I do not believe the rumored iTV exists; it doesn’t make much sense compared to doing a home entertainment system like this. Convincing people to upgrade a home entertainment box yearly at under $200 is a lot easier than convincing them to do the same with a $2,500 TV. I also don’t believe Apple needs to make a TV to reinvent it or finally make it useful and less buggy. My own Philips “smart” TV is certainly not without problems, and nice as it is, it is far from smart, something the addition of my AppleTV makes up for.

The only question in my mind is whether this will be a WWDC announcement or whether they will hold a separate event for it. WWDC is a developer-centric event and will be packed with other announcements. This would also be big enough that a dedicated event would make sense.

The heart says WWDC; the brain says a separate event in the near future.

The iWatch

I think Apple has a wrist-worn device in development. I also think Tim Cook is right when he says that it has to have broader appeal than a current-generation smartwatch or any of the currently available wrist devices, like the Fitbit Flex or the Nike+ FuelBand, which are essentially single-purpose, display-less exercise aids, if it is to get people wearing it. Only a minority of the population wears a watch these days, and convincing them to put something on their wrists will require it to be sensational.

I do not think we will see such a device at WWDC. The technology required for something as ambitious as the iWatch simply doesn’t appear to be quite there yet. I think this year belongs to the AppleTV; the iWatch is a surprise that will have to wait until next year at least.

Systemd in GNOME, PackageKit and what GNOME as an OS really means

Is the sky falling? Is GNOME going Linux-only?

A recent proposal by PulseAudio and systemd lead developer Lennart Poettering to add a dependency on systemd raised concerns that GNOME might drop support for non-Linux platforms.

Rest assured, this is not the aim. In follow-ups to his proposal, Lennart explains that systemd could be separated into a core set of interfaces with replaceable backends, so that a platform such as FreeBSD would be supported as long as it implements the interfaces systemd cares about, or adopts systemd as its init system. What Lennart doesn’t want is a lot of additional code in systemd as it stands today to support these platforms, as one of its main advantages is the simplicity and elegance gained by relying on the functionality Linux provides.

Why should we care about what systemd cares about?

Because systemd gives us a powerful set of tools to improve the user experience, on top of the improvements in performance and standardization it has already promised and demonstrated (read Lennart’s excellent series explaining systemd on his blog). With systemd we can replace some core functionality, such as ConsoleKit, which would allow for a smoother multi-user experience.

Take a simple problem such as setting the pretty hostname that gives your machine its identity. Systemd now makes this possible by standardizing such things as where this data is stored and in what format: fundamental assumptions about the system that benefit the user experience.
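
As a rough sketch of what that standardization buys an application, here is how one might read the pretty hostname from /etc/machine-info, the file systemd’s hostnamed documents for this purpose (a real consumer would normally ask hostnamed over D-Bus rather than parse the file itself; the class name here is mine):

    using System;
    using System.IO;

    class PrettyHostname
    {
        // Read PRETTY_HOSTNAME from /etc/machine-info, the standardized
        // KEY="value" file maintained by systemd's hostnamed. Because every
        // systemd system agrees on this location and format, a desktop shell
        // no longer has to guess where each distribution keeps it.
        static void Main ()
        {
            foreach (var line in File.ReadAllLines ("/etc/machine-info")) {
                if (line.StartsWith ("PRETTY_HOSTNAME=")) {
                    Console.WriteLine (line.Substring ("PRETTY_HOSTNAME=".Length).Trim ('"'));
                    return;
                }
            }
            // Fall back to the plain kernel hostname if no pretty name is set.
            Console.WriteLine (Environment.MachineName);
        }
    }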

Systemd goes beyond that: its interfaces provide a set of information and functionality we can use to make GNOME more user friendly. For example, systemd lets us provide a smoother experience via its control-group tracking of all processes. This allows balancing of CPU (and likely also IO) resources between applications, making system slowdowns more graceful and the overall experience smoother. The tracking also gives GNOME precise knowledge of these processes, data which might be used to improve how gnome-shell displays information to the user.
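
To make the control-group tracking a little more concrete, here is a minimal sketch that reads the kernel’s /proc/&lt;pid&gt;/cgroup file directly; the class name is mine, and real GNOME code would of course go through systemd’s own interfaces:

    using System;
    using System.IO;

    class CgroupLookup
    {
        // Show which control group(s) a process belongs to. Under systemd
        // every service and login session lives in its own cgroup, which is
        // what makes the precise per-application tracking described above
        // possible.
        static void Main (string[] args)
        {
            int pid = args.Length > 0 ? int.Parse (args[0]) : 1; // PID 1 is systemd itself
            foreach (var line in File.ReadAllLines ("/proc/" + pid + "/cgroup")) {
                // Each line has the form "hierarchy-id:controllers:/cgroup/path"
                var parts = line.Split (new[] { ':' }, 3);
                Console.WriteLine ("{0,-20} {1}", parts[1], parts[2]);
            }
        }
    }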

Shouldn’t we wait to depend on systemd until other platforms are somehow supported?

In honesty, resources are scarce, and the truth is that the vast majority of GNOME’s developers and users are on Linux. We have a reference implementation now on that most-used platform, and relying on its interfaces would allow us to provide a superior user experience both short term and long term. Depending on systemd only means depending on its interfaces, and competing kernels and init systems could very well provide those interfaces as well. That effort, though, falls on their shoulders, but there is apparent willingness to cooperate.

How this is analogous to PackageKit in the long term

Many people misunderstand PackageKit, mostly I suspect because they have had poor experiences with the default PackageKit user interfaces. PackageKit is not about those tools; PackageKit is about defining a common interface for talking to the package manager. This allows, for example, integration so that the system can be asked to install support for missing formats if it is available. Common examples would be missing compression formats like RAR or missing codec support such as MP3.
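
As a sketch of what that integration looks like from an application’s point of view, assuming the NDesk.DBus binding used elsewhere in the Mono world and PackageKit’s session D-Bus interface (take the exact method name and signature as my assumption; the session-interface spec is the authority):

    using NDesk.DBus;

    // Assumed shape of PackageKit's session "Modify" interface; consult the
    // PackageKit session-interface documentation for the authoritative API.
    [Interface ("org.freedesktop.PackageKit.Modify")]
    public interface IPackageKitModify
    {
        void InstallMimeTypes (uint xid, string[] mime_types, string interaction);
    }

    public class MissingCodecDemo
    {
        public static void Main ()
        {
            // Ask the session PackageKit service to find and install whatever
            // package provides MP3 support on this particular distribution.
            var pk = Bus.Session.GetObject<IPackageKitModify> (
                "org.freedesktop.PackageKit",
                new ObjectPath ("/org/freedesktop/PackageKit"));
            pk.InstallMimeTypes (0, new[] { "audio/mpeg" }, "show-confirm-search");
        }
    }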

It is not about .deb vs. .rpm, nor yum vs. apt-get!

PackageKit, like systemd, exists precisely to avoid those fights. The existing tools and package repositories are excellent; what we care about is not replacing them but working with them in a consistent fashion. In PackageKit every package manager implements a backend which supports a common interface, in the same way that depending on systemd allows the assumption of a common set of interfaces which can be used to enhance the user experience. There should be nothing technically barring an analogous solution of separated backends for systemd, just as PackageKit has.
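
To illustrate the shape of the idea (this is not the actual PackageKit backend API, which lives inside PackageKit itself; it is only a hypothetical sketch of a common contract with distribution-specific implementations):

    // Hypothetical contract: each distribution's package manager implements
    // the same small interface, and everything above it codes against the
    // contract rather than against yum, apt-get or zypper directly.
    public interface IPackageBackend
    {
        void RefreshCache ();
        string[] WhatProvides (string capability);   // e.g. a MIME type or codec
        void InstallPackages (string[] packageIds);
    }

    public class AptBackend : IPackageBackend
    {
        public void RefreshCache () { /* drive apt here */ }
        public string[] WhatProvides (string capability) { return new string[0]; }
        public void InstallPackages (string[] packageIds) { /* drive apt here */ }
    }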

But the PackageKit user interfaces are still ugly, David!

That is true, and it is widely agreed that the Ubuntu Software Center is a superb experience. It currently works not by using apt-get directly but by using an incompatible PackageKit fork, aptdaemon. Porting it to PackageKit is being undertaken by Alex Eftimie under Google’s Summer of Code 2011, so fear not: you shall have the same experience as always, and it will be available on any GNOME platform, naturally depending on the existence and completeness of a PackageKit backend, though most major distributions are covered to some degree.

Ubuntu’s other tools, such as the update experience, are also aptdaemon tools and could be ported. My personal feeling is that the better investment of resources would be not to specify separate GNOME 3 stories for upgrades and updates in addition to the stories already told by PackageKit.

PackageKit and systemd are slow!

And I postulate to all that slow is a bug. In the case of systemd one of the benefits should be performance, and Lennart is already matching Ubuntu’s Upstart-powered 10-second boot, as I understand it with patches on a standard Fedora 15 install and no LVM. PackageKit may have hard problems to solve to match what aptdaemon gives Ubuntu in terms of performance and certain features, but Richard Hughes has put such concerns to shame before with actual hacking. I trust him to solve this problem long term, and in the meantime GNOME can reap the benefits of the assumptions PackageKit allows it to make now.

GNOME as an OS is (partly) about interfaces, not about defining a Linux-only desktop that runs only on Thursdays if the window is open

Interfaces like PackageKit and systemd allow GNOME to solve problems and provide real improvements to the experience. The sad side effect of leveraging what the vast majority of GNOME users already have in Linux is that, in the short term, GNOME will be Linux-only. In the long term it is up to the competition to provide the same interfaces. This is no different from depending on Tracker or GTK+: these needed tools provide the interfaces we require and might not run on a given platform. Given resource constraints it must sadly fall upon those platforms to contribute the required interfaces themselves.

An adventure in Open Source contribution

I’m now officially a contributor to Banshee.

While I have done a lot of advocacy work and packaging, this is my first-ever proper code contribution to Open Source. Coding as such never really excited me, and as a result it has been some 5 or 6 years since I last sat down to understand and work on significant code. Even then I never got deep into programming, as specification and design were always more fun to me than implementation.

It all started when a friend buzzed me a presentation by Anders Hejlsberg titled The Future of C#. While I haven’t done much with .NET, I have always been impressed by it as technology, and I was eager to learn what new tools would come in the future. Naturally the talk was also attractive to me because one of the features Anders demos as a coming post-4.0 feature, Compiler as a Service, is in Mono today, something I always like to think about when people say that Mono will forever be “chasing the dragon”. Regardless, the talk got me excited about coding and was extremely entertaining to boot. So I wanted to try something, anything, and since I like Banshee but also see it crashing and being slow a lot as a daily stress tester and bug filer, I decided to subject it to experimentation.

In comes the magic of .NET, Mono, and Ubuntu. In Ubuntu I found all the tools I needed, namely MonoDevelop, mono-tools and finally Gendarme. Gendarme is a really cool project that can inspect assemblies and executables against a set of rules covering such things as security, performance and even bad practices. So I decided to run Gendarme on the contents of /usr/lib/banshee-1, expecting to see a few hits and probably a lot of false positives. However, Gendarme returned more than 8800 issues even on medium settings, so I limited my focus to just the performance rule set.

Gendarme’s issue reports come with excellent documentation, with examples of bad and good code as well as careful explanations, making it easy to pick a simple problem such as the one addressed by my patch. In this case the affected fields are now resolved at compile time rather than at run time, which is faster. It is safe to do and doesn’t break external assemblies, as the fields are not visible outside the assembly. All of this was explained by Gendarme and confirmed on the Banshee IRC channel; Gendarme even explained how to fix the issue, so it could not have been easier. Bertrand Lorentz was kind enough to sign off on the patch and commit it within minutes. The Gendarme article on the issue type my fix addresses can be found here.
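
For illustration, the kind of change that rule asks for looks roughly like this; the field name is made up and this is not the actual Banshee code:

    class BufferSettings
    {
        // Before: an init-only field, whose value is assigned at run time
        // when the type is initialized.
        //     internal static readonly int DefaultBufferSize = 4096;

        // After: a literal (const) field, baked into the IL at compile time.
        // This is safe here because the field is not visible outside the
        // assembly, so no external code has a stale copy of the old value.
        internal const int DefaultBufferSize = 4096;
    }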

Regardless, that was yesterday. Today my Banshee is once again a git build done by hand, something the excellent Banshee daily repo has made unnecessary since I stopped contributing to Fedora. The reason is simple: I needed to compile-test some more changes as I was reading the Banshee source code and learning. With friendly hints from the existing developers, I am also growing a basic understanding of what is going on.

Contribution is easy, zero to sixty even, with Mono and Banshee.

Fixing audio issues on the Acer Aspire Revo R3610, or how to update a BIOS when all you have is a Linux machine

I love my Revo, but lately I discovered a small audio problem wherein I would need to reinitialize the sound card after it had been idle for half an hour or so. On the advice of the always awesome Daniel T. Chen I checked whether a BIOS update was available, and what do you know, there was. There was even one labelled “Linux”, though no changelog was provided. A secondary problem is that the required flashing tool for this update is only available for DOS and Windows. However, there is a workaround: making a bootable ISO with FreeDOS.

This process should work for any machine:

  1. Obtain the desired BIOS update from the vendor website
  2. Open a terminal and change to root
  3. cd /tmp
  4. wget http://www.fdos.org/bootdisks/autogen/FDOEM.144.gz
  5. gunzip FDOEM.144.gz
  6. mkdir floppy
  7. mount -t vfat -o loop FDOEM.144 /tmp/floppy
  8. Now copy the BIOS update and the flash tool(s) into the floppy directory
  9. umount floppy
  10. mkisofs -o bios.iso -b FDOEM.144 FDOEM.144

You now have an ISO image you can burn to a CD which will boot into good old DOS and allow you to update your BIOS. Not optimal but a viable solution for important updates.

And here is one I prepared earlier.

Dear Nokia

Why did you feel the need to develop the very nice-looking N900, probably one of the nicest-looking smartphones on the market right now, and then leave it with a battery that falls over after listening to music for a few hours? You should be ashamed of charging so much for the N900 when it is frankly a tablet with the world’s longest extension cord. Also, it would be nice if you realized we live in the 21st century and that designing your software around 90s concepts needlessly creates a data silo from which I have to pry my information.

Paying for software and supporting Open Source

I am getting a little tired of the accusation levelled at me, when debating with the anti-Mono crowd, that I don’t support Freedom and that I am destroying Linux. I have even been accused of taking a paycheck from Microsoft and/or Novell, but to be clear: neither company has ever paid me a dime for any work; in fact no technology company has ever given me a salary. I have taken gifts in return for work, e.g. Novell kindly gave me a copy of openSUSE 11.1 and a t-shirt for some bug reporting.

While these kinds of claims are entirely baseless slander, I think now is the time to come out and say that I love freedom. I love freedom so much that not only have I used Linux as my sole OS for more than a decade, I also actively donate to projects and people that benefit our ecosystem.

I am a dues-paying member of the EFF, and I am a dues-paying Friend of GNOME. I preordered the Yo Frankie! game to support open game development; even though I never actually got around to playing it, it seemed like a very important missing piece for Open Source to cover high-quality open gaming and show that it can be done with full transparency.

When Richard Hughes asked for money to buy a color calibration thingy, I donated, even though I likely have no need for the work he will be doing with it. Richard has donated much of his considerable talent, time and effort to projects such as gnome-power-manager and PackageKit, work for which I am grateful every day, and he deserves my gratitude in the ways I can show it and my thanks for improving Linux.

Likewise I am a customer with Fluendo, not because I feel I have to get around software patents, since they are not currently legal in Denmark, but because as a software tester I feel I should test not just the solutions kindly available to me but also those I advise less fortunate people to examine. That being said, I have actually found the Fluendo codec pack solves issues present in the open solutions, and having a working DVD player is great. I don’t especially enjoy the proprietary nature of these products, but I know that much of the money I pay Fluendo will be put directly back into GStreamer development and advocacy for Open Formats.

On the open content side I make sure to buy documentaries and movies such as Sita Sings the Blues, Good Copy Bad Copy and the Piracy Documentary. I have also just signed up for a membership with Magnatune to support their fine service (about which I have ranted previously).

Most important to me personally though is the time I put into bug reporting and following Linux every day.

All of this can of course be documented, but really it shouldn’t have to be. The accusation that I am destroying Linux is a lowbrow attack which is beneath any reasonable argument. Consider this what should be a completely unneeded rebuttal to such claims.

So I ask you: what have you done for Open Source lately? Do you merely rant and leech, or do you support it with actions, words and wallet when you can?

Jono Bacon to be replaced with a small shell script

Inside sources at Canonical report to me that due to the global financial crisis and a desire to become profitable, Canonical will be replacing Jono Bacon with a shell script.

It has become increasingly apparent over the years that all Jono really does is post random crap to his blog, play bad music loudly and drop his pants in public. All except the last one can be performed today using a few lines of basic shell script at a significantly lower cost.
