Installing SteamOS Beta

Valve recently released SteamOS into the wild in beta form and as soon as I could I downloaded it and got it installed on my low to mid range gaming PC.  My motherboard, CPU and RAM are newer but the GPU is a bit on the older and inexpensive side.  That said, it is very capable of running SteamOS and I imagine anyone who has built a gaming rig in the last few years will be able to run it.

Step One: Have a PC


Actual hardware requirements for SteamOS aren’t outlandish: any 64-bit capable Intel/AMD CPU, 4GB of memory and at least a 500GB hard drive.  The harder requirement is that the motherboard must support UEFI booting.  There are workarounds for this, but they’re beyond the scope of this post.

Let’s get started with the installation. I’m looking at this from the perspective of a Mac user, so many of the tools used to get things prepared are Mac based.  Here’s what you need:

  • PC meeting the above requirements
  • Flash drive larger than 1GB that can get partitioned/formatted
  • An empty hard drive or a really good backup of your current system
  • A machine to download and prepare the installer with, I’m using a Mac with OS X
  • The SteamOS installation files (http://store.steampowered.com/steamos/download/?ver=custom)

Preparing the flash drive

Insert the flash drive into an available USB port and start up Disk Utility.  Click once on your flash drive (it’ll be listed on the left side) and then click the Partition button.  From the partition layout menu choose  1 Partition.  Name the drive if you wish and ensure that the Format is set to MS-DOS (FAT).  Last, double check that the partition type is Master Boot Record by clicking on the Options… button.  Use the following screenshots as a reference:

Create 1 partition formatted in MS-DOS (FAT)


Ensure the partition type is Master Boot Record


When you are satisfied with the parameters click the Apply button and finally the Partition button.  Your flash drive is now ready to copy the SteamOS installation files to.
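If you’re comfortable in Terminal, the same layout can be produced with diskutil instead of clicking through Disk Utility.  This is a sketch only: the device node (/dev/disk2 below) is a guess, so confirm yours with diskutil list before running anything destructive.

```shell
# Build (but don't run) the diskutil command for a given device node:
# an MBR partition map holding one FAT32 partition named STEAMOS that
# spans the whole drive. /dev/disk2 is an assumption -- check `diskutil list`!
make_partition_cmd() {
  echo "diskutil partitionDisk $1 1 MBR FAT32 STEAMOS 100%"
}

# Printing the command instead of executing it avoids erasing the wrong disk.
make_partition_cmd /dev/disk2
```

Running the printed command erases the drive, which is exactly what clicking Apply in Disk Utility does, so double check the device node first.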

Copying files to the flash drive

Download the installer files and extract them if they aren’t already. I’m going to assume that the files were extracted into your Downloads directory.  If not, then you’ll need to adjust the paths used in the next command.  To copy the files to the flash drive I used rsync in terminal.  This ensures the files are copied including any hidden files. Use the following command:

rsync -av ~/Downloads/SteamOSInstaller/ /Volumes/UNTITLED\ 1/

Remember to adjust any paths depending on how you named your flash drive or if you didn’t extract the SteamOS Installer files in your Downloads directory. Also note that there IS a trailing slash on the SteamOSInstaller directory.  This is important!

Copying files to the flash drive can take some time


Press enter and allow the rsync operation to finish.  Once done, eject the flash drive using Finder.

Prepare the target PC

To prepare my PC for SteamOS I unplugged all of my internal drives.  This is absolutely vital if you don’t want to risk having your existing hard drives wiped clean! I happen to have an extra 500GB drive sitting around for this project; if you don’t, or don’t have your PC backed up, stop here because you’re about to lose everything.

If I haven’t scared you off, then we can continue.

In my BIOS I ensured that all of the UEFI boot options were either enabled or would occur first.  This step is going to be different on each motherboard so you’ll need to play around to make sure things are right.  Basically:

  • Ensure the system will boot using UEFI at all
  • Ensure that UEFI booting is enabled for USB ports and USB flash drives
  • Ensure the UEFI is used before any legacy option

Insert the flash drive into a USB port on your computer and boot it.  Enter your motherboard’s boot menu if you have one, or set your system to boot from USB first.  My system allows me to bring up a boot menu and I pick the UEFI USB Hard Drive option:

Boot menu


You should then see the following:

I picked Automated Install

Pick Automated Install and the first phase of the SteamOS installation will get started.  You’ll be looking at a lot of these for a while:

Progress bars


Allow this phase to complete.  Eventually you’ll be told the system is going to reboot. When it does, remove the flash drive. SteamOS will then reboot to a standard login screen.

Initial Configuration

At this point you are faced with a standard login screen.  Do the following:

  1. Enter steam for the username, press enter
  2. Enter steam for the password, press enter
  3. Click Activities in the upper left
  4. Click Applications
  5. Click Terminal
  6. Type in steam and press enter
  7. Accept the EULA
  8. When finished, log out
    1. Click steam in the upper right
    2. Click logout
    3. Click the logout button

Here are some screenshots for reference:

The login screen


Starting terminal


You must now log in as the desktop user by doing the following:

  1. Enter desktop for the username, press enter
  2. Enter desktop for the password, press enter
  3. Click Activities in the upper left
  4. Click Applications
  5. Click Terminal
  6. Type ./post_logon.sh, press enter
  7. Enter desktop as the password, press enter

The system will now perform a number of post phase 1 install routines and then reboot.  After rebooting the system will create the system restore partition. You simply answer yes to a question and the rest is automated.

Just say yes


Pick reboot and press enter


Once completed the system will reboot again into SteamOS and finally into Big Picture Mode where you can create or log into your Steam account. The initial boot up can take some time so be patient.  You are now ready to go!

These are interesting times for gaming

The next couple of years are going to be interesting in the gaming space. We’ve got a new but relatively underpowered console from Nintendo; Sony has announced what is, on paper anyway, an incredibly powerful and connected gaming experience; and everyone expects Microsoft to announce something by the end of the year. Nintendo’s 3DS is rebounding nicely from a slow start, as is Sony’s Vita.

On the opposite end of the spectrum we have game makers and casual game players increasingly concentrating on the mobile space, where specs are fairly low and development costs much more manageable.  Mobile games have eaten away at handheld gaming console market share by giving casual gamers exactly what they want: something to pass the time without costing much, on a device they carry with them everywhere.  Who wants to carry a dedicated handheld console that costs as much as an on-contract phone, with games that cost $30 and up?

In 2013 home consoles are starting to get a bit long in the tooth and all the major players are just about ready to get their next console into consumer hands.  If we define console generations by release date, then Nintendo is the first of the major players to release a next generation home console to follow up the Wii, PlayStation 3 and Xbox 360 generation.  Released in 2012, the Wii U is Nintendo’s latest offering, with HD graphics, a tablet based controller, improved online functionality and compatibility with the old motion based Wii controllers.

It’s nearly everything the Wii should have been in 2006.

When the Wii launched in 2006 it was the only console not to offer HD graphics or elaborate online functionality.  Nintendo didn’t believe that most consumers had HD flat panel TVs, needed or even wanted HD graphics, and it failed to understand the growing importance of competitive and cooperative online play and matchmaking. By nearly all measures the Wii was merely adequate at the time and woefully out of date years later.  While the Wii was a roaring success in terms of sales, it did little to push gaming forward as an art form, unlike the games a number of studios attempt to make each and every generation.

Nintendo saw great success initially, delivering their usual mix of first party titles, but because, I believe, the underpowered nature of the Wii prevented developers from creating the same kind of cutting edge games they were making for competing consoles, the Wii became a casual and party game kiosk.  Many third party developers skipped the Wii when creating their latest AAA titles because it didn’t offer the power and online capabilities of its competitors.  The Wii’s major innovation in 2006 was also its biggest Achilles’ heel: the unorthodox motion based control scheme, which also seemed to turn off developers.  By 2009 Wii sales were starting to fall, and Nintendo chalked it up to competition from the PlayStation and the Xbox while ignoring the growing threat of mobile gaming.  By then the Wii’s reputation as a casual gamer’s paradise was well established, but with the introduction of Apple’s iPhone, iPod and (especially) the iPad, along with Android based phones, casual gamers had a plethora of casual games that cost far less while looking just as good as, if not better than, anything the Wii could do.

At the time, Nintendo needed a hit and they took a chance by creating a console that didn’t try to compete with other consoles on performance.  Instead they went after the market with an innovative and accessible control scheme at a price no other console could match.  Looking back, Nintendo made the right move, but they reacted too slowly to the changing market and didn’t think far enough long term.  In the end, the attempt to grab and grow the casual market that was so successful initially was also their downfall.  It turns out that when given a choice, casual gamers will go with ever cheaper offerings.

So here we are in 2013 and it seems that Nintendo is attempting to do it again.  They’ve released a new console that is adequate for today but will not be able to compete on performance with consoles from Sony and Microsoft years from now.  Developers are already dismissing the Wii U as underpowered and not worth their time.  So does this mean the Wii U and Nintendo are doomed?

Maybe not.

There are a few things that might work out in Nintendo’s favor in this generation.  For one, game development costs are beginning to skyrocket.  It turns out that in order to push gaming forward and to make games for these next generation consoles it takes incredible financial resources. The cost of creating games for true, next generation hardware could simply surpass what game studios can recoup and it is this potential problem that could make Wii U a more attractive console.  If game developers can deliver games that are even slightly better than “current generation” while keeping costs down it could be a big win for Nintendo.

Another recent shift in gaming that could be helpful for Nintendo is the renewed interest in indie level games.  Nintendo has been doing everything it can to court indie developers and encourage them to release their titles on Wii U.  The tablet based controller also makes it easier for developers of mobile games to port their games to Wii U and keep a similar playing style, despite Nintendo’s biggest oversight: not including a multitouch capacitive display.

The next PlayStation and Xbox will no doubt have many of the same indie offerings, but the best thing Nintendo could do today is cut the price of the Wii U slightly to ensure more consoles are sold before Sony’s and Microsoft’s offerings are available, giving the Wii U a strong foothold in the market as a device that is cheaper to develop games for and indie friendly (read: casual gamer paradise).  Failure to do so will be the demise of the Wii U this generation, which may very well be Nintendo’s last chance to regain the throne it once held rather than being looked back at nostalgically.

Tomb Raider (2013) graphics settings for 2GB Nvidia 550 Ti

Just picked up Tomb Raider (2013) for PC and I thought I’d post the settings I’m using for others to try.  I don’t have the best card around so I attempted to dial in something the card was able to do while maintaining 30fps.  I’m running the latest Nvidia drivers as of 3/5/2013 and Nvidia has already stated that they were caught off guard so hopefully the framerate will improve with updated drivers and/or settings can be increased.

2GB 550 Ti Tomb Raider settings
 resolution: 1920x1080
 refresh: 60Hz
 v-sync: double buffer
 fullscreen: on
 exclusive fullscreen: on
 display: 1
 monitor aspect: auto
 quality: custom
 texture quality: ultra
 texture filter: anisotropic 4x
 hair quality: normal
 anti-aliasing: FXAA
 shadows: normal
 shadow resolution: high
 level of detail: high
 reflections: high
 depth of field: normal
 SSAO: normal
 post processing: on
 tessellation: off
 high precision: on
min: 30.0, max: 44.0, avg: 33.2 fps

Configuring Apache to issue the proper HTTP Link header for tent.io

One of the great things about tent.io is how it discovers where your server is.  This is important because you can keep your tent entity URL indefinitely while changing which server actually acts on its behalf.  http://tent.io/docs/server-protocol details how the process works so I won’t get into it here.  I’m going to quickly cover how you add this header in Apache, in either a VirtualHost config file or an .htaccess file.  If you only have access to your .htaccess file, make the change there; if you can edit your virtual host config file, you can do it there instead.  The end result is the same.

The format is the same in either the VirtualHost config file or .htaccess file.  It is simply:

<IfModule mod_headers.c>
  Header set Link "<https://controlplane.tent.is/tent/profile>; rel=\"https://tent.io/rels/profile\""
</IfModule>

Replace “https://controlplane.tent.is/tent/profile” with the location of YOUR profile.  Place the above text directly into your .htaccess or within the <VirtualHost></VirtualHost> stanza of your virtual host config file.
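For reference, a complete .htaccess built around that directive might look like this sketch (the profile URL here is a placeholder):

```shell
# Write a minimal .htaccess containing only the Link header directive.
# https://example.com/tent/profile is a placeholder -- use YOUR profile URL.
cat > .htaccess <<'EOF'
<IfModule mod_headers.c>
  Header set Link "<https://example.com/tent/profile>; rel=\"https://tent.io/rels/profile\""
</IfModule>
EOF
```

The IfModule wrapper keeps Apache from erroring out if mod_headers happens to be disabled; the header simply won’t be sent in that case.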

If you edit your virtual host config file, you’ll need to reload or restart Apache for the changes to take effect.  Keep in mind that you can’t add headers to a redirect, so if you use a redirect to add www to your site address (for example) you can’t put this in the redirect stanza.

You can test the results using curl -I <hostname> or online using http://web-sniffer.net

Zelda Historia book coming to North America

So this is a fun one if you’re someone who likes the Legend of Zelda series.  This book includes original artwork, character profiles and concept designs, backgrounds, history, an interview with the creator and more – The Legend of Zelda 25th Anniversary Hyrule Historia Art Book

Creating “adisk” records in avahi

This post describes how to set up “adisk” records in avahi, useful if you want to advertise a Time Machine-capable AFP share on your Linux system.

 

http://www.nexenta.org/projects/6/wiki/AFP_with_TimeMachine
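The linked article boils down to dropping a service file into /etc/avahi/services and restarting avahi-daemon.  Here’s a sketch of such a file; the txt-record flags and the adVN volume name are assumptions taken from common examples, so adjust them to match your AFP share:

```shell
# Sketch of an avahi service file advertising an AFP share with Time Machine
# support. Writes to the current directory; move the result into
# /etc/avahi/services/ and restart avahi-daemon to publish it.
cat > timemachine.service <<'EOF'
<?xml version="1.0" standalone='no'?>
<!DOCTYPE service-group SYSTEM "avahi-service.dtd">
<service-group>
  <name replace-wildcards="yes">%h</name>
  <service>
    <type>_afpovertcp._tcp</type>
    <port>548</port>
  </service>
  <service>
    <type>_adisk._tcp</type>
    <port>9</port>
    <!-- sys/dk0 records tell OS X the share supports Time Machine;
         adVN is the advertised volume name (an assumption here). -->
    <txt-record>sys=waMa=0,adVF=0x100</txt-record>
    <txt-record>dk0=adVN=TimeMachine,adVF=0x82</txt-record>
  </service>
</service-group>
EOF
```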

Running Solr 3.4, with multicore, on Ubuntu 10.04

My multicore Solr on Ubuntu 10.04 post has proven to be one of my most popular yet.  Given its success, I decided it was time to show how to get the latest version of Solr up and running on Ubuntu 10.04.  As of this writing the latest version of Solr is 3.4.0.

Before we get started you should read and follow my previous post because I borrow all of the config settings from Ubuntu’s Solr 1.4 packages.  The default config settings from the Ubuntu maintainers are still a decent starting point with Solr 3.4.  Once finished you can safely remove the old Solr 1.4 package if you want to.

With a working Solr 1.4 installation in place, we can get started on getting Solr 3.4 running.  You can change some of the following paths if you want, just remember to change them in all of the appropriate places.  Everything you’re about to see should be done as the root user.

Create some required paths

mkdir /usr/local/share/solr3
mkdir /usr/local/etc/solr3
mkdir -p /usr/local/lib/solr3/data

Next, re-own the data dir to the proper user

chown -R tomcat6.tomcat6 /usr/local/lib/solr3/data

Download the latest version of Solr

You can get the latest version of Solr from http://lucene.apache.org/solr/ and extract the files into root’s home directory.

wget http://mirrors.axint.net/apache//lucene/solr/<version>/apache-solr-<version>.tgz
tar zxvf apache-solr-<version>.tgz

Extract the Solr war file

Extract the Solr war file into its new location.  You may need to install the unzip utility with apt-get install unzip.

cd /usr/local/share/solr3 
unzip /root/apache-solr-<version>/dist/apache-solr-<version>.war

Install additional libs

There are a few other libs included with the Solr distribution.  You can install anything else you need; I specifically need the dataimporthandler add-ons.

cp /root/apache-solr-3.4.0/dist/apache-solr-dataimporthandler-* WEB-INF/lib/

Configure Multicore

If you want to have multicore enabled you’ll need to perform the following actions.  The rest of this post assumes you have copied this file and will require you to make some changes to support multicore.  I’ve marked steps that can be skipped if you also wish to skip the multicore functionality.

Copy in the multicore config file:

cp /root/apache-solr-3.4.0/example/multicore/solr.xml .

You should now edit the solr.xml file at this point, doing the following:

  • Set persistent to true
  • Remove entries for core0 and core1
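After those two edits, solr.xml should look roughly like the sketch below, which is based on the stock multicore example with the core0/core1 entries removed (new <core> entries get written inside <cores> as cores are created):

```shell
# Overwrite solr.xml with a persistent, empty-cores configuration.
# This sketches the end state of the edits above; it assumes the
# /usr/local/share/solr3 working directory used earlier in the post.
cat > solr.xml <<'EOF'
<?xml version="1.0" encoding="UTF-8" ?>
<solr persistent="true">
  <cores adminPath="/admin/cores">
  </cores>
</solr>
EOF
```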

Next, change the ownership and permissions so that Tomcat is able to modify this file when needed:

chown tomcat6.tomcat6 /usr/local/share/solr3
chown tomcat6.tomcat6 /usr/local/share/solr3/solr.xml

Copy existing config files

This is where we’re going to borrow some files from Ubuntu’s Solr package maintainer.

cd /usr/local/etc/solr3
cp -av /etc/solr/* .

Because we simply copied the config files we need to modify them to fit our new environment.  Change the following in the solr-tomcat.xml file:

  • Change docBase to /usr/local/share/solr3
  • Change Environment value to /usr/local/share/solr3

Also edit tomcat.policy file changing:

  • Modify all entries referencing solr to point to appropriate /usr/local location

Change the following in conf/solrconfig.xml:

  • Change <dataDir> to /usr/local/lib/solr3/data

If you are using multicore and you followed the Solr 1.4 multicore post, you’ll have a conftemplate directory and you’ll need to make changes to conftemplate/solrconfig.xml:

  • Change <dataDir> to /usr/local/lib/solr3/data/CORENAME

Create symlinks

Here we’ll create some symlinks to support the way Ubuntu packages Solr.  This is necessary because we copied Ubuntu’s config files and those files reference a few locations.  Creating the symlinks also allows us to continue using the scripts created in the previous post with minimal modifications.

cd /usr/local/share/solr3
ln -s /usr/local/etc/solr3/conf
ln -s /usr/local/etc/solr3/ /etc/solr3
ln -s /usr/local/lib/solr3 /var/lib/solr3

Enable/Start the new Solr instance

We can now enable our new Solr 3.4 instance in tomcat by doing the following:

cd /etc/tomcat6/Catalina/localhost
ln -s /usr/local/etc/solr3/solr-tomcat.xml solr3.xml

Note that the name of the symlink is important as it will define where we find this instance (/solr vs /solr3).  At this point you can create a new core.  I’ve provided the updated scripts here.

 

OS X Lion

I’m not going to lie.  I think OS X Lion 10.7.0 is a buggy release.  Is it buggier than some other releases of OS X?  Possibly.  Can Apple fix the bugs?  Most certainly.  But bugs aside, there are a few design decisions Apple made that don’t seem fully baked.

First, let’s touch on some of the bugs I’ve noticed so far.

Finder is one of those things in OS X that is almost universally disliked for one reason or another.  Finder in Lion has a new feature where it just stops doing things altogether.  At times disk usage stops being updated and it won’t actually copy files.  While a restart of Finder resolves the issue, it’s odd that it exists at all.

Wi-Fi, formerly known as AirPort, has a strange tendency to just not connect after resuming from sleep.  That said, when it is connected I find it to be more reliable with more stable throughput.

Launchpad, the iOS like view of your installed applications has a tendency at times to lag heavily when launching an app.

There are a number of other smaller bugs that exist in Lion that are a bit grating but I have faith that Apple will fix them in short order.  Leopard was initially, at least in my opinion, unusable after the initial installation and I found myself going back to Tiger a couple of times.  Apple fixed those issues and then some.

But what really gets me are the things Apple will probably never fix, because they are working as designed and my real issue is that I don’t like the design.  Gestures, for one, are a mess.  Many were changed from Snow Leopard, and worse, a good number of them contradict what a person would have already learned.  A four finger swipe up now produces Mission Control rather than show desktop; show desktop has been replaced by a more awkward five finger gesture.  All in all, I spent more time tweaking gesture settings on Lion than anything else after install.  Between the available options in System Preferences and BetterTouchTool I think I have things where I want them.

More annoying than the gestures is the addition of “natural scrolling.”  Natural scrolling reverses the scrolling direction so that to scroll the page down you pull your fingers down on the trackpad or mouse.  The naming of this option is also interesting because unchecking it suggests to the user that they are about to enable something less natural.  I don’t think that could be further from the truth.  Like flying a plane, it’s natural for your body to want to push the stick forward to cause the plane to pitch down, but to push left or right to bank left or right.  Natural scrolling makes complete sense on a touch device, where it feels like you are pushing a sheet of paper around.  At any rate, my issue comes in when you disable natural scrolling.  Not only does it reverse scrolling, it also reverses the direction used for changing spaces.  With natural scrolling off, swiping four fingers left brings the space on the left into view and four fingers right brings in the space on the right.  On paper this makes sense, but in practice it feels awkward.

Lion also lacks the kind of polish I’ve come to expect from OS X.  Parts of it are downright ugly.  Mail.app, for example, has a new layout which is great except for the hideous message count badging, shown below:

 

There is just something about the numbers that make them appear to be off in some fashion.

The boot process, at least what you see on screen, has been revamped some, and I can’t help but feel it looks clunky.  While fading and moving the Apple logo from the center of the screen to above the list of users on the login screen is very clever, the steps required to move from boot splash to this animation are jarring.  The boot process boils down to showing the typical boot splash screen with the Apple logo, which is then replaced with an image that looks the same and is ultimately used in the final animation that reveals the available users.  The transition just isn’t the kind of smooth, elegant thing a person would expect from Apple.  Couple that with the sometimes jarring color correction applied just prior to the animation effect and you have what is, in my eyes, a really poorly done boot sequence.  The shutdown process is also odd in that the desktop goes away and is covered with a plain gray screen.  The blue screen used in previous releases was much better, and if it had to be replaced at all it should have been replaced with black.

All that said, there is a lot to like about Lion.  I find the autocorrect to be a fine addition.  I like Mission Control a lot, resume is a great feature, Mail.app’s new layout is superb and the refinements to iCal and Contacts are welcome.  I know Apple will fix the real bugs in the software but I can only hope they provide better System Preference options for customizing gestures.

I’m also surprised that the reviews I read gave Lion glowing marks without pointing out its shortcomings.  As I said, there is a lot to like, but it certainly isn’t perfect and I think Apple deserves to hear about it.  Lion isn’t Apple’s Vista by any means, but it’s obvious to me that Jobs had less input in this release than in previous ones.

Announcing ControlPlane, context sensitive computing for OS X

ControlPlane is a fork of the MarcoPolo project that I’m officially releasing today.  You can learn more about ControlPlane at http://controlplane.dustinrue.com/

OS X Lion and netatalk

I recently upgraded to Mac OS X Lion (10.7) and found that I couldn’t connect to my netatalk server anymore. Thanks to this blog post I was able to get Lion connected to my Linux based AFP server.