Wednesday, May 25, 2011

Live Blog from ICSE: The view of ICSE's supporters

This is my fifth live blog post from ICSE, this time from the session giving the perspectives of ICSE supporters.

I will be updating this post as the speakers talk (which is what I have done with the others). I fix spelling at the end.

The first talk in this session is The Grand Challenges of Software Engineering - A Perspective from the Trenches, by T.S. Mohan of Infosys Technologies Ltd.

The presenter first discussed a little of the history of 'grand challenges', for example the NSF grand challenges in computational engineering, putting a man on the moon in the 1960s, curing cancer (not achieved so far), solving Fermat's Last Theorem (accomplished), etc.

He also talked about grand challenges in Computer Science, such as proving that P is not equal to NP, the Turing test, and automatic translation.

He discussed Hoare's criteria for what constitutes a grand challenge. For example, it arises from scientific curiosity, has enthusiastic support, has international scope, is a comprehensible problem, captures the imagination of the public and the esteem of scientists, was formulated long ago and still stands, and goes beyond what is initially possible, requiring the development of new techniques. In particular, it should be rather obvious when it has been achieved, it should lead to a radical paradigm shift, and it is not likely to be met by market forces.

He discussed a supposed grand challenge of trusted components that was set in 2003, but doesn't count since it didn't 'take off'.

When he gets to his list of challenges I will comment on each a little. I hope he includes the concept of merging modeling and programming so we can generate systems very quickly and easily.

Right now he is talking about a model that is very similar to the spiral model; he calls it the new lifecycle, but it isn't very new.

He commented on acceptance rates at ICSE, and suggested that key topics like design, which he said "really needs to be done", did not have very many accepted papers. He pointed out that mobile computing had 0 papers; I note that HCI got 0 papers too, although both these areas have specialist conferences.

Here come his grand challenges:

1. Cruising Scale: From birthing centre to hospice: A medical management system that can handle the entire medical system

2. Consistent and uniform programming tools for heterogeneous systems. This is the one I was looking for.

3. Validated trust: Verifying tools: Compilers, model checkers, configuration and deployment validators.

4. Dynamic compositionality: A global trusted marketplace for compositional software components and services

5. Simplifying complexity: Automated tools and environments for enhancing non-functional requirements

An audience member pointed out that many of these do not meet the Hoare criteria: in particular, will we know when they have been achieved? Are they long-standing? Fair comment.


The second talk is Connect and Collaborate by Judith Bishop of Microsoft Research.

The presenter showed a slide with a huge number of research areas in which MSR is involved. She discussed the group she is involved with, Microsoft Research Connections; this group links external researchers to Microsoft. She pointed out that it doesn't just do computing (e.g. it is working on the WorldWide Telescope), but SE is number one.

Her group gives awards to universities, such as the CRA Research Awards, Faculty Fellowships and conference sponsorship. She said the jewel in the crown is the SE Innovation Foundation; the 2012 foundation awards will be in the area of mobile computing.

She discussed Pex4Fun, which she said has gone viral, and the ICSE contest at http://bit.ly/icse2011contest . She also talked about the Kinect SDK and Debugger Canvas.

Finally, a colleague gave a cool demo of the TouchStudio beta, a tool for writing a program easily on a phone. The tool lets you put a program together using a touch interface. See below:



This environment sounds extremely promising. It has access to all the standard capabilities of the Windows Phone 7, including the GPS, camera, accelerometer, etc. The following is the code that has been visually created. I know it is not legible, but it gives you a sense of what Microsoft expects end-users to be able to create. This relates to Mary Shaw's keynote at CSEE&T that I blogged about yesterday.
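Since the screenshot of the generated code is not really legible, here is a rough stand-in I wrote myself: it is plain Python with stubbed phone APIs, not actual TouchStudio syntax, and is only meant to suggest the kind of small sensor-driven program an end-user might put together.

# Purely illustrative (not TouchStudio syntax): take a picture, tag it with
# the phone's location, and share it. The phone APIs below are stand-in stubs.
def camera_take_picture():      # stub for the phone's camera API
    return "<picture>"

def gps_current_location():     # stub for the phone's GPS API
    return (45.42, -75.70)      # hypothetical coordinates

def share(item, caption):       # stub for a sharing/social API
    print("shared %s with caption %r" % (item, caption))

def main():
    pic = camera_take_picture()
    lat, lon = gps_current_location()
    share(pic, "Taken at %.2f, %.2f" % (lat, lon))

if __name__ == "__main__":
    main()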



It puts the pressure on Google and Apple to come up with something similar.


The final talk is How Software is Engineered at Google, by Marija Mikic-Rakic of Google.

This sounds very exciting. The room is packed.

She started by discussing how the Google search system is composed of many subsystems that all work together. She said that every Google application is growing in many dimensions: more users, more data and better quality (e.g. more relevant and faster results). She attributed many of Google's gains to dramatic improvements in the speed and capacity of hardware.

She said they need to balance simplicity, features, scalability, performance and reliability (note that simplicity is first).

She highlighted some skills needed in employees: Statistics, machine learning and production software engineering.

The design stages are: Brainstorming, investigating, discussing, prototyping, documenting (not formal, no UML), reviewing (designs available to everyone in the organization), coding and reassessing. I note that this doesn't sound very agile!

How do they deal with complexity? They ...
  • Build simple
  • Reuse existing components.
  • Consider latency and throughput
  • Parallelize when possible
  • Consider resource constraints
  • Design with scalability in mind
  • Design for reliability (assume machines will fail)
  • Do back-of-the-envelope calculations
  • Understand how data is accessed
Interesting list; seems rather ad hoc though.

She gave a list of numbers that every engineer should know, such as the number of ms to send a packet from California to the Netherlands and back.
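For a sense of what she meant, here is a back-of-the-envelope sketch in Python. The latency figures are my own approximations of commonly quoted values from around that time, not numbers taken from her slide.

# Approximate "numbers every engineer should know" (illustrative values,
# circa 2011 -- not taken from the talk's slides).
NS = 1e-9
LATENCY_SECONDS = {
    "main memory reference": 100 * NS,
    "read 1 MB sequentially from RAM": 250e3 * NS,
    "round trip within a datacenter": 500e3 * NS,
    "disk seek": 10e6 * NS,
    "read 1 MB sequentially from disk": 20e6 * NS,
    "packet California -> Netherlands -> California": 150e6 * NS,
}

def sequential_read_seconds(megabytes, per_mb_seconds):
    """Back-of-the-envelope time to stream `megabytes` MB sequentially."""
    return megabytes * per_mb_seconds

# e.g. scanning 1 GB from disk versus from memory:
print("1 GB from disk: %5.1f s" % sequential_read_seconds(
    1024, LATENCY_SECONDS["read 1 MB sequentially from disk"]))
print("1 GB from RAM:  %5.2f s" % sequential_read_seconds(
    1024, LATENCY_SECONDS["read 1 MB sequentially from RAM"]))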

Basic truths:
  • Every new line of code is a liability
  • No large functions or classes
  • Refactor code that has gone crufty
  • Make sure not to break the build
  • Documentation matters
  • Code needs to be localized
  • People spend more time reading code than writing it
  • Make sufficient and clear comments (update with each change)
  • Latency is a deal breaker (profile your code, then improve)
Tools Google engineers use:
  • Source control
  • Bug tracking
  • An editor
  • One touch build
  • Tons of documentation
  • Standard libraries
  • Unit and other tests
  • Code reviews
Google has more than 5000 developers in 50 offices, more than 2000 projects under development, more than 50,000 builds every day, and more than 50 million tests run a day. There are 20 code changes a minute; half the files in the codebase are changed every month.

There is one source repository and they use Perforce for source control.

Goals: Speed, high quality feedback and simplicity (there it is again).

They use a caching system to distribute source code to developers; a full checkout would take tens of minutes. Developers change less than 10% of the code they check out.

They have done a lot of work on their build system; compilation is done in the cloud. Compiles are cached, so people who build the same things don't redo the work. The presenter commented that they need people with strong CS skills to make this work. The cache for the build system has more than 10,000 cores using more than 50 TB of memory. This saves 600 person-years in wait time. Build outputs are not sent to user workstations until they are needed, to avoid overloading the network.
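To make the caching idea concrete, here is a minimal sketch in Python of a content-addressed build cache; it is my own illustration of the general technique (key each compile by a hash of its inputs), not a description of Google's actual system.

import hashlib

class BuildCache:
    """Toy content-addressed build cache: identical inputs -> cached output."""

    def __init__(self):
        self._cache = {}  # digest of inputs -> build output

    @staticmethod
    def _key(sources, flags):
        h = hashlib.sha256()
        for path in sorted(sources):
            h.update(path.encode())
            h.update(sources[path].encode())
        h.update(flags.encode())
        return h.hexdigest()

    def build(self, sources, flags, compile_fn):
        key = self._key(sources, flags)
        if key in self._cache:                 # someone already built exactly this
            return self._cache[key]
        output = compile_fn(sources, flags)    # the expensive step
        self._cache[key] = output
        return output

# Two developers building identical sources share one (cached) compilation.
def fake_compile(srcs, flags):
    return "objects for " + ", ".join(sorted(srcs))

cache = BuildCache()
out1 = cache.build({"a.cc": "int main() {}"}, "-O2", fake_compile)
out2 = cache.build({"a.cc": "int main() {}"}, "-O2", fake_compile)  # cache hit
assert out1 == out2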

The continuous integration system analyses every check-in and runs all relevant tests (from 1 to 150000 tests). They have a storage system called Sponge that stores all the test results.
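As an illustration of how "all relevant tests" could be selected, here is a hedged sketch in Python that walks a reverse dependency graph from the changed files to the test targets they can affect; the target names are made up and this is not Google's actual infrastructure.

from collections import defaultdict

def affected_tests(dependencies, tests, changed_files):
    """Return the tests whose transitive dependencies include a changed file.

    dependencies: dict mapping each target to the targets/files it depends on.
    tests: the subset of targets that are test targets.
    changed_files: files touched by the check-in.
    """
    # Build the reverse edges: file/target -> things that depend on it.
    rdeps = defaultdict(set)
    for target, deps in dependencies.items():
        for dep in deps:
            rdeps[dep].add(target)

    # Walk outward from the changed files to everything that depends on them.
    affected, frontier = set(), list(changed_files)
    while frontier:
        node = frontier.pop()
        for dependent in rdeps[node]:
            if dependent not in affected:
                affected.add(dependent)
                frontier.append(dependent)
    return affected & set(tests)

# Hypothetical example: only search_test depends (transitively) on ranker.cc.
deps = {
    "ranker": ["ranker.cc"],
    "search": ["ranker", "index"],
    "search_test": ["search"],
    "ads_test": ["ads"],
}
print(affected_tests(deps, {"search_test", "ads_test"}, {"ranker.cc"}))
# -> {'search_test'}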




