Ubuntu Compute and Storage Build

The goal of this machine was to replace the oldest active computer in the house. This old computer predates – by over three years – the first recorded entry on this blog for computer builds. (First build: core i7). This old computer was a combined “compute and storage” build – before I moved to dedicated compute and storage machines.

Its main claim to fame is its case: a SuperMicro SC-743 4U case (now numbered as CSE-743T-665B) which sells for $320 today. Fifteen years ago it was around $200. It contains a row of 4 hot-swap high-speed fans in the middle of the case. This was the case I bought right after my infamous “Just because a case has space for 6 hard drives, that doesn’t mean it has adequate cooling for 6 hard drives” moment of wisdom. Unfortunately, those fans howl like a jet engine, and it was time to replace it.

Some facts on the CPU: the i5-4690 is currently #191 on PassMark [7,623] at cpubenchmark.net. At a price of $224, it has a “value” rating of 34.0. The Core i5-4590 has a PassMark of [7,224] for $180 and a “value” rating of 40.3, the highest “value” of all the i5s. Note that my “on sale” i5-4690 was only $210. It is the 3rd fastest i5; the fastest is the i5-5675C @ 3.10GHz [8,106], at #165 for $320.

All product links are from the actual vendor.

Item Product Cost
CPU Intel Core i5-4690 Haswell Quad-Core 3.5GHz Socket 1150 84W Intel Graphics 4600 $210
RAM G.SKILL Ripjaws X 16GB (2 x 8GB) 240-Pin DDR3 SDRAM PC3 1600 Desktop Memory Model F3-1600C9D-16GXM $80
Motherboard GIGABYTE GA-Z97N0D3G LGA 1150 Intel Z97 HDMI SATA 6Gb/s USB 3.0 Micro ATX $109
Power Supply Corsair CX750 750W 80 Plus Bronze certified, Haswell Ready $60
Video Intel HD Graphics, built in
Case Antec Three Hundred Two Gaming Case, Black $58
SSD Drive Samsung 840 Pro 256GB SATA III MZ-7PD256BW $170
HD Drive WD Black 1TB WD1003FZEX 7200 RPM 64MB cache SATA 6.0Gb/s $80
BD/DVD/CD Samsung DVD Burner 24x SATA Model SH-224BEBE $20
OS Ubuntu 14.04 64bit $0
Total $787
Posted in Computer Builds | Comments Off on Ubuntu Compute and Storage Build

Secret Share 1.4.2 on Maven Central

Secret Share in Java on Maven Central

Just completed a release of the Secret Share in Java project to Maven Central.

Search for it using search.maven.org.

GroupId: com.tiemens
ArtifactId: secretshare
Version: 1.4.2

This release features a “simplex” matrix solver implementation (Thanks to Pat J) that greatly speeds up the “combine” operation, and greatly increases the number of shares that can be handled.
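
To make “combine” concrete: recovering the secret from k shares means recovering the constant term of a degree-(k−1) polynomial. Below is a generic sketch of that operation via Lagrange interpolation – the textbook approach, not the secretshare library’s actual API or the new solver’s code:

```java
import java.math.BigInteger;

// Generic sketch of Shamir "combine": recover the secret (the polynomial's
// constant term) from k shares by Lagrange interpolation at x = 0, mod a prime.
// NOT the secretshare library's code -- just an illustration of the operation.
public class CombineSketch {
    static BigInteger combine(BigInteger[] xs, BigInteger[] ys, BigInteger p) {
        BigInteger secret = BigInteger.ZERO;
        for (int i = 0; i < xs.length; i++) {
            BigInteger num = BigInteger.ONE, den = BigInteger.ONE;
            for (int j = 0; j < xs.length; j++) {
                if (j == i) continue;
                num = num.multiply(xs[j].negate()).mod(p);         // (0 - x_j)
                den = den.multiply(xs[i].subtract(xs[j])).mod(p);  // (x_i - x_j)
            }
            BigInteger term = ys[i].multiply(num).multiply(den.modInverse(p)).mod(p);
            secret = secret.add(term).mod(p);
        }
        return secret;
    }

    public static void main(String[] args) {
        // secret = 1234, f(x) = 1234 + 166x (mod 1613), shares at x = 1 and x = 2
        BigInteger p = BigInteger.valueOf(1613);
        BigInteger[] xs = { BigInteger.valueOf(1), BigInteger.valueOf(2) };
        BigInteger[] ys = { BigInteger.valueOf(1400), BigInteger.valueOf(1566) };
        System.out.println(combine(xs, ys, p)); // prints 1234
    }
}
```

For k shares this is O(k²) multiplications plus k modular inversions, which is why a better solver matters as the share count grows.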

Sonatype Notes:

First, run the “uploadArchives” target. Make sure all of the uploads report no errors (e.g. each “Transferring nnnK” matches an “Uploaded nnnK”).

Second, go to the sonatype console at https://oss.sonatype.org/#stagingRepositories
The Sonatype web interface continues to be “less than optimal”. First, if that link does not display the menu item “Build Promotion” on the left-hand side, you must switch browsers (to IE).

Once you can see those menu items:

  1. Select “Staging Repositories”, then enter the search string. If the result line does not have a “select check box” on the left-hand side of the line, then you will need to find yet another browser (or you have entered your search in the wrong mode – you must be in “Staging Repositories”, not the generic search).
  2. Once you can see that check box, select it, and details will appear in the window below. In addition, the button row, starting with “Refresh”, will now show a “Close” button. Push it.
  3. Wait a minute, navigate away from the page and then back, and when you select the repository again it will show a “Release” button. Push that.

Posted in Software Project | Comments Off on Secret Share 1.4.2 on Maven Central

Stealth Updates and Unstable LSI Drivers

(Just FYI to everyone out there, since this is not easy to find right now.)

If you’re seeing this message:

mpt2sas0: log_info (0x31080000): original (PL), code (0x08), sub_code (0x0000)

Then you need to visit http://mycusthelp.info/LSI/_cs/AnswerDetail.aspx?&inc=8484

The short answer: Avago Technologies (which acquired LSI in 2014) performed a stealth update to the P20 driver .zip artifacts on 21-MAY-2015. Re-download the P20 .zip that contains the firmware (the xxxx_P20_IR_IT_Firmware_BIOS_for_MSDOS_Windows downloads contain the firmware .bin file), and upgrade your LSI firmware to the re-released version. Sadly, neither the .zip nor the .bin is actually named that way.

As related to the third item from my Computer Science Hard Things post, this is clearly a failure in dependency resolution, and is why correctly labeling artifacts with versions is critical.

Posted in Ubuntu | Comments Off on Stealth Updates and Unstable LSI Drivers

Ubuntu 14.04 Unity 3D RAM

My AMD Server seemed to be running out of RAM this morning. Checking the processes, it appeared that Unity 3D was using approximately 18G of 32G (i.e. with no virtual machines running, the OS was still using 18G). I don’t know why Unity 3D freaked out, but ‘compiz’ was chewing up 1.5G all by itself. A quick check showed that Unity 2D is no longer available in Ubuntu 14.04.

So, I installed gnome-session-flashback.

After it was installed, and after logging out and back in under the Metacity session, the baseline RAM footprint dropped to 1.5G total.

Posted in Ubuntu | Comments Off on Ubuntu 14.04 Unity 3D RAM

Freenas Backup Machine

The goal of this machine was to be a “small, inexpensive, bring your own HDs, standalone backup solution”.

For these purposes, that meant using a small case that still had at least 2 internal 3.5″ bays.

For FreeNAS, I used the latest version where the .img file was available; 9.3 is available, but only as an .iso file. Another item of note when using the .img on a USB drive: on the first boot, it will appear to hang after showing “waiting up to 5 seconds for ixdiagnose to finish”. It isn’t stuck – it is just resizing the filesystem on the USB drive. It took mine about 9 minutes to finish this step. After the first boot completes this step, it does not stall there again.

Some facts on the CPU: the G3450 is currently #471 on PassMark [3,777] at cpubenchmark.net, with a “value” rating of 58.7. Intel is producing so many clones of the Xeon E5, at so many different clock speeds, that the first sub-$1000 CPU is #28 (the Core i7-5930K @ 3.50GHz, $580). The only Core i7 that is sub-$300 is the $299 i7-4790 @ 3.6GHz at #58, with a score of 10,105 and a “value” of 32.4. It used to be fun to get a CPU in the top 50, but it looks like that will never happen again.

All product links are from the actual vendor.

Item Product Cost
CPU Intel Pentium G3450 Haswell Dual-Core 3.4GHz Socket 1150 53W $90
RAM Corsair Vengeance 4GB (1 x 4GB) 240-Pin DDR3 SDRAM DDR3 1600 Desktop Memory Model CMZ4GX3M1A1600C9 $44
Motherboard GIGABYTE GA-B85M-HD3 LGA 1150 Intel B85 HDMI SATA 6Gb/s USB 3.0 Micro ATX $71
Power Supply TFX 275W Power Supply, with case
Video Intel HD Graphics, built in
Case APEX DM-387 Black Steel Micro ATX Media Center / Slim HTPC Computer Case w/ ATX12V TFX 275W Power Supply $57
USB Drive Kingston Digital 8GB DataTraveler Micro USB 2.0 (DTMCK/8 GB) $6
HD Drive BYOD $50-$400
OS Freenas 64bit $0
Total $268 + drives
Posted in Computer Builds | Comments Off on Freenas Backup Machine

What is AngularJS – the key is client-side

After working with AngularJS for a couple of months now, I can finally express a concise answer to “What is AngularJS?”

It is:

  1. MVC where the model is on the client side
  2. MVC where the view is a template based in the .html, and is rendered on the client side
  3. MVC where the controller is “live” – changes to the model reflect in the template immediately

The key: “on the client side”. No more complicated mappings inside your .jsp from fields to Java objects, no more complicated mappings from “post actions” to specialized controllers that track the application state. No more painting the initial page one way with .jsp and then updating it with AJAX. It replaces your .jsp template with a more natural .html with embedded template variables and controls, and keeps everything straight.
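
To make that concrete, here is a minimal AngularJS 1.x sketch (the script URL and version number are just examples): the model (`name`) lives entirely on the client, and the template re-renders as you type – no server round trip, no .jsp.

```html
<!-- Minimal AngularJS 1.x sketch: client-side model, live-rendered template -->
<html ng-app>
  <head>
    <script src="https://ajax.googleapis.com/ajax/libs/angularjs/1.3.0/angular.min.js"></script>
  </head>
  <body>
    <!-- ng-model binds the input to the client-side model field "name";
         the {{ }} expression below updates immediately as the model changes -->
    <input type="text" ng-model="name" placeholder="Your name">
    <p>Hello {{name}}!</p>
  </body>
</html>
```

That one file is the M (the `name` field), the V (the template with `{{name}}`), and an implicit live C – all on the client.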

AngularJS throws in a couple of “neat tricks” – dependency injection, testability, separation of client and server, scope. But AngularJS’s two tag lines – “HTML enhanced for web apps” and “AngularJS — Superheroic JavaScript MVW Framework” – are neither of them very helpful.

AngularJS (or some other library that does MVC-client-side better, now that the secret is out of the bag) is the wave of the future. The productivity gains are incredible. It is literally easier to re-write your .jsp in AngularJS and implement that one new feature than it is just to extend your .jsp.

Posted in Software Engineering | Comments Off on What is AngularJS – the key is client-side

Amazon SDK broken dependencies

If you have received this error message:

java.lang.IllegalStateException: Unsupported cookie spec: default

It is because Amazon made their SDK dependency look like this:
+— com.amazonaws:aws-java-sdk
| +— org.apache.httpcomponents:httpclient:[4.1, 5.0) -> 4.4-beta1
| | +— org.apache.httpcomponents:httpcore:4.4-beta1

i.e. they made an open-ended statement that their SDK would work with all 4.x releases of httpclient.

As of 4.4-beta1, their statement became false. Somewhere down in the guts of httpclient, “default” is no longer a valid cookie specification, and now parts of the AWS SDK do not work.

See 2014/06/28/computer-science-hard-things/ for a full essay on the problems with “dependency resolution”. In this case, Amazon just messed up – there is no way any particular aws-java-sdk release can claim compatibility with an entire 4.x line of the httpclient library.

The fix (at least in gradle; it should be similar in all build systems) is to exclude httpclient on the aws-java-sdk line, and then add a specific httpclient dependency (e.g. 4.1 worked nicely for me, since presumably Amazon actually tested with that release before shipping). Since Amazon was “fuzzy” about their actual dependency requirements, you may have to try 4.2, 4.3, etc. to find an actually-compatible-with-aws version of httpclient.
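
A sketch of that exclusion in a build.gradle (the aws-java-sdk version number here is illustrative – substitute whatever release you are actually using):

```groovy
dependencies {
    // Exclude the open-ended httpclient range that aws-java-sdk declares...
    compile('com.amazonaws:aws-java-sdk:1.9.0') {
        exclude group: 'org.apache.httpcomponents', module: 'httpclient'
    }
    // ...and pin a release Amazon presumably tested against.
    compile 'org.apache.httpcomponents:httpclient:4.1'
}
```

Run `gradle dependencies` afterwards to confirm the `-> 4.4-beta1` resolution is gone.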

Posted in Software Engineering | Comments Off on Amazon SDK broken dependencies

Use a DSA to implement your DSL (inspired by Cucumber)

This post was inspired by The Training Wheels Came Off by Aslak Hellesøy, author of The Cucumber Book.

TL;DR – Use a Domain Specific (testing) API to implement your Domain Specific Language

That article describes the motivation behind removing web_steps.rb — in a nutshell, they were removed because these step definitions are not at the correct level of abstraction for a properly defined Cucumber .feature file. The direct quote on the subject: “Cucumber was designed to help developers with TDD at a higher level”.

The basic idea is that your .feature file should not be written like this:

Scenario: Successful login
  Given a user "Aslak" with password "xyz"
  And I am on the login page
  And I fill in "User name" with "Aslak"
  And I fill in "Password" with "xyz"
  When I press "Log in"
  Then the http status should be 200
  Then the http session cookie should not be empty

Instead, your .feature file should look like this:

Scenario: Successful login
  Given log in succeeds with a user "Aslak" with password "xyz"

Notice at this level, there is no mention of http, http status 200, cookies, buttons or button names, etc. It describes only the high-level test.

In his article, he codes to the idea in this post, but never explicitly names it. The idea: keep your .feature definitions high-level, and implement your step definitions using a set of intermediate helper methods. This intermediate level is what I call the Domain Specific API (DSA) from my title. It looks like this:

  @Given("^log in succeeds with a user \"([^\"]*)\" with password \"([^\"]*)\"$")
  public void log_in_succeeds(String user, String password) {
     dsa.actionLogin(user, password);
  }
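
The DSA layer itself is then just plain code. Here is a hypothetical sketch – `LoginDsa` and its in-memory user store are stand-ins for the real HTTP-backed implementation:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical DSA-layer sketch: the step definition calls dsa.actionLogin(...),
// and the DSA hides the technology details. Here a plain Map stands in for the
// real HttpClient work -- status codes, session cookies, button presses.
class LoginDsa {
    private final Map<String, String> users = new HashMap<>();

    // Test-setup verb: create the user the scenario's "Given" refers to.
    public void givenUser(String name, String password) {
        users.put(name, password);
    }

    // Domain verb: attempt a login and report success or failure.
    public boolean actionLogin(String name, String password) {
        return password.equals(users.get(name));
    }
}

public class LoginDsaDemo {
    public static void main(String[] args) {
        LoginDsa dsa = new LoginDsa();
        dsa.givenUser("Aslak", "xyz");
        System.out.println(dsa.actionLogin("Aslak", "xyz"));   // prints true
        System.out.println(dsa.actionLogin("Aslak", "wrong")); // prints false
    }
}
```

The point of the layer is that swapping the Map for real HttpClient calls changes nothing above it – the step definitions and .feature files stay untouched.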

In essence, the approach leads to four levels of test:

  1. .feature file
  2. step definitions implementations
  3. DSA implementations
  4. technology library (e.g. HttpClient)

The extra “Domain Specific API” layer allows you to dive into the implementation-specific details without “polluting” your main .feature files with too many details.

Reference Links:
For Java, see Cucumber-JVM.
For Ruby, see Capybara.

Posted in Software Engineering | Comments Off on Use a DSA to implement your DSL (inspired by Cucumber)

Computer Science Hard Things

There is a popular saying about Computer Science (see here and here):

There are only two hard things in Computer Science: cache invalidation and naming things.

— Phil Karlton

There is a funny variation that makes it “There are only two hard problems in Computer Science: cache invalidation, naming things, and off-by-one errors.”

I propose there are actually three hard things:

  1. Naming things
  2. Cache invalidation
  3. Dependency resolution

My criteria for being a “hard thing”:

  1. Must be applicable to multiple scopes
  2. Must not be fully solved

Examined this way, it is interesting to see why these three deserve to be on the list:

  1. Naming things
    1. Applicable to every area in computer science – variable names, class names, machine names, network names, security policy names, URIs, etc. It even applies to this list: think of the difference between naming the first item “cache invalidation” versus just “caching”.
    2. Not at all solved. You can barely say we have good heuristics for this.
  2. Cache invalidation
    1. Applicable to multiple layers of computer hierarchy: CPU registers, L1, L2, L3, etc., disk caches, network resource caches, DNS caches, etc.
    2. Solved in the sense we know it is a balancing act between efficiency and correctness. Not solved for the general case, however. If there even is a “general case” at all.
  3. Dependency resolution
    1. Applicable to multiple domains: run-time (think Dependency Injection), build time (think Apache Ivy and Maven), hardware-software, distributed systems, and probably more
    2. Solved in the sense we know about topological sorting to help with transitive dependencies.
      For run-time, the entire sub-field of dependency injection has multiple solutions: Spring Framework, Guice, PicoContainer. Does anybody remember DLL Hell? That shows that “API definition” (which is a candidate for its own “Hard Thing” entry) is a sub-problem of dependency resolution.
      For build time, the better build systems make it easy to specify your dependencies and add global exclusions to get you out of transitive dependency issues.
      For hardware-software, think about the hardware requirements for running a particular application or installing a particular driver.
      For distributed systems, think about (for example) which version of which database your application requires. Provisioning has been partially solved by Chef and Puppet and others; detection is still very much roll-your-own.
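
To illustrate the topological-sorting piece mentioned in item 3, here is a small sketch of Kahn’s algorithm over a hypothetical dependency chain (the artifact names are made up for the example):

```java
import java.util.*;

// Sketch of Kahn's topological sort -- the standard tool build systems use
// to put transitive dependencies into a workable order.
public class TopoSketch {
    static List<String> topoSort(Map<String, List<String>> deps) {
        // Count incoming edges for every node in the graph.
        Map<String, Integer> indegree = new HashMap<>();
        for (String n : deps.keySet()) indegree.putIfAbsent(n, 0);
        for (List<String> outs : deps.values())
            for (String m : outs) indegree.merge(m, 1, Integer::sum);
        // Start from the nodes nothing depends on.
        Deque<String> ready = new ArrayDeque<>();
        for (Map.Entry<String, Integer> e : indegree.entrySet())
            if (e.getValue() == 0) ready.add(e.getKey());
        List<String> order = new ArrayList<>();
        while (!ready.isEmpty()) {
            String n = ready.remove();
            order.add(n);
            for (String m : deps.getOrDefault(n, List.of()))
                if (indegree.merge(m, -1, Integer::sum) == 0) ready.add(m);
        }
        // Leftover nodes mean a cycle -- the part no sort can fix for you.
        if (order.size() != indegree.size())
            throw new IllegalStateException("dependency cycle");
        return order;
    }

    public static void main(String[] args) {
        Map<String, List<String>> deps = new HashMap<>();
        deps.put("app", List.of("aws-sdk"));
        deps.put("aws-sdk", List.of("httpclient"));
        deps.put("httpclient", List.of("httpcore"));
        System.out.println(topoSort(deps)); // prints [app, aws-sdk, httpclient, httpcore]
    }
}
```

Note that this only orders the dependencies; picking compatible *versions* of each one is the part that remains a Hard Thing.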

So, did I create any converts? Do you agree there are 3 Hard Things in Computer Science?

Posted in Software Engineering | Comments Off on Computer Science Hard Things

Agile non Evolutionary Stable Strategy

Starting with a quotation I saw at a rest stop while on vacation:

Dryland farming works best when in a wet year.

This was on a placard explaining that Dryland farming had a string of successful years when it was wet. But when it got dry, the same techniques failed.

I’ll summarize that as: if you have a problem, and you implement a fix, and the problem goes away, you still might not know how well your fix actually worked.

Which brings me to how Agile has “fixed” waterfall development. And whether these are just the “wet years” for Agile.

Getting back to the title – an Evolutionary Stable Strategy (ESS) describes a strategy that, if adopted by a population in a given environment, cannot be invaded by any alternative strategy that is initially rare. The first thing to note is that waterfall development is positively a non evolutionary stable strategy – Agile started out rare, but has effectively invaded. (Subjective judgement only – statistics are hard to find. Most of the statistics like to spout “Agile is 3 times more likely to succeed” – as if a 3% chance versus a 1% chance is worth bragging about…) It is also true that waterfall itself had previously been invaded by “hybrid waterfall” long before Agile, making it doubly non-ESS, if that is even possible. Being non-ESS is no big deal in itself.

Of interest here is why I’m claiming that Agile is non-ESS. After all, Agile is still on its upswing (again, hard to find concrete statistics here). And quite possibly, no development process is stable because of the inherent fickleness of management and their desire to chase the new fad. So, if no process is stable, then it isn’t saying very much to claim that Agile is also non-ESS.

My value-add is: I think I know the reason Agile will be successfully invaded and replaced.

I believe Agile’s replacement will come about as a result of Tim’s Rule on Agile (only highly experienced developers can make Agile work) and Choose Your Path Wisely (after years of choosing not to learn, you no longer have the option of learning). My assertion: Agile does not create developers that are sufficiently capable of executing Agile successfully. It is based on observing developers with multiple years of “successful” Agile development experience who are, at the same time, lacking in critical software engineering skills.

Why does it matter, you ask? If they are on a successful Agile project, and have “succeeded” without those skills, then aren’t those skills by definition not needed? My answer: the skills are needed, and they are being provided by developers (or scrum masters, or business owners, etc.) with extensive non-Agile experience. And once that pool of people is gone, or stretched too thin, the Agile-only generation, who chose the Agile path, will be unable to step up and provide those critical skills (not “unwilling”, just literally “unable”).

There will be much pain and frustration as the Agile consulting industry struggles to figure out what has gone wrong. And lots more finger-pointing (“You’re doing Agile wrong!”). In the end, I’m claiming that Agile is only working now because these are the “wet” years. And the “dry” years are coming.

Posted in Software Engineering | Comments Off on Agile non Evolutionary Stable Strategy