Rose Williams

About me

I am a lecturer at Binghamton University, currently teaching introductory Python to students new to programming, and Software Engineering to computer science majors.
Previous courses taught include Programming with Objects in Java, Data Structures in C++, and Introduction to Computer Science.

I taught 'classic' Software Engineering for many years; then, a few years ago, I became intrigued by Agile Software Engineering. Although many had panned it as too radical or 'Software Engineering Lite', to me it seemed like a natural development: when you carefully prune away the bloat and overgrowth from the 'classic' methodology down to its BEST practices, the discipline so uncovered is Agile! Since this realization, I have switched my Software Engineering curriculum to Agile, which I believe will better prepare my students for the rigors of real-world software development.

Prior to joining the university, I was a Software and Systems Engineer for Link Flight Simulation, working in the areas of Tactics/Avionics, Visual Database, and Radar Simulation. Projects included simulators for the AH-64 Apache Helicopter, the B-2 Stealth Bomber, the F-16 Fighter Jet, and other rotary-wing military aircraft.

One of the interesting things about working at Link was that even though our process model was Plan and Document, some of our practices could be considered embryonic forays into what would later be incorporated into Agile. Unlike many other companies, we followed a 'womb to tomb' style of development. Once you were assigned a system, you typically took it from concept through acceptance and delivery: we were responsible for our own design, documentation, implementation, and testing. During the early phases of a project, as soon as requirements were in place, we would write our acceptance tests. When we were designing our various systems down to the units, we were expected to write our test procedures. Once we had implemented some code, we tested it ourselves. System integration was a team sport, and we all participated. Whereas in many companies it was customary to 'throw the code over the wall to QA', our QA people didn't carry out our tests, but rather checked and verified them, along with all of our code and documentation.
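The 'acceptance tests written as soon as requirements are in place' practice described above can be sketched in Ruby (one of the languages listed under Skills below). The requirement and the method names here are purely hypothetical, invented for illustration; the point is only the ordering: the checks are written first, directly from the requirement, and the implementation is written afterwards to satisfy them.

```ruby
# Hypothetical requirement: a modification is accepted into the nightly
# load build only if it is submitted before the evening cutoff hour.

# Acceptance checks, written FIRST, straight from the requirement:
def check_nightly_build_rules
  raise 'late mod must be rejected'    if accepted_for_nightly_build?(21)
  raise 'on-time mod must be accepted' unless accepted_for_nightly_build?(9)
  true
end

# Implementation, written AFTERWARDS to satisfy the checks above:
def accepted_for_nightly_build?(submitted_hour, cutoff_hour = 20)
  submitted_hour < cutoff_hour
end

check_nightly_build_rules  # raises if the implementation misses the requirement
```

In a modern Agile/BDD setting the same idea would typically be expressed with a framework such as RSpec or Cucumber, but the workflow is the same: the executable check precedes the code it checks.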

Among the many experiences I had there, a few stand out. As is typical for a Plan and Document project, our development of the prototype AH-64 simulator was behind schedule. So our company made an agreement with the Army to allow their personnel to train on the simulator during the day (for about a year) while we finished its development. At the end of each Army training session, they would meet with the Project Leads and Managers to report on what was working, what they liked, what wasn't working, what they didn't like, etc. If there was an issue with your system, you would meet with them and discuss it. A new load was built every night with the modifications made as a result of the meeting from the previous (not current) day. The next day, a number of us would gather at 4AM to test the new load, then run missions to exercise as many of the completed systems as possible. At 8AM we would turn the simulator over to the Army personnel, meet to discuss further action items, and work on our respective development and action items from the previous evening's meeting. After the Army finished for the day, turned the simulator back over to us, met with us, etc., we again had to submit all of our mods in time for the next load build. Although our workflow wasn't necessarily that unusual, interacting with our client on a daily basis was! Typically, client interactions only happened during requirements elicitation, critical reviews, and acceptance. I believe that these interactions ultimately helped us to develop a superior product, with a much greater understanding of and match to their requirements. Although our project was behind schedule and over budget, upon completion it was heralded as the most sophisticated simulator ever produced at that time. It won the 1985 Daedalion Weapon Systems Award, and was the first simulator to do so.

Other experiences run the gamut from the instructive (e.g., learning the importance of understanding the client domain during the development of the Saudi Arabian and Egyptian visual databases) to the terrifying (e.g., sustaining a serious injury when the AH-64 motion system went out of control). All, however, have emphasized the importance of creating GOOD software, and of having a Software Engineering process model that nurtures its development.

Skills & Technologies

  • BDD
  • TDD
  • Rails
  • Agile
  • Ruby

Contributions

  • GitHub: 1 commit (x 1 = 1 point)
  • Hangouts hosted: 2 (x 1 = 2 points)
  • Hangouts attended: 2 (x 1 = 2 points)
  • Authentications: 2 (x 100 = 200 points)
  • Profile completeness: 10 out of 10
  • Membership length: 6 out of 6
  • Sign-in activity: 6 out of 6

Latest pairing videos by Rose Williams

  • AutoGraders Client Meeting (BetaSaaSers), 17:23 10/02
  • Video unavailable ('Start Broadcast' not pressed, or Hangout/YouTube failure), 17:36 11/02
  • PairProgramming on Autograders, 17:29 04/02