Beki Grinter

Posts Tagged ‘computer science’


In academia, computer science, discipline, research on October 27, 2011 at 3:09 pm

I just read Geoff Nunberg’s perspective on Steve Jobs and iSchools. It was the first time I’d ever seen the i in iPad connected to the i in iSchool, but of course the connection is really between the liberal arts and technology. As Nunberg says:

It [contemporary technology trends] isn’t just about computer science anymore, either. That isn’t where you go to find out how technology changes people’s lives, and where it fails them, or how to make it less intrusive and more humane. Those are the questions people are taking up at the Schools of Information that have sprung up at research universities like UCLA, Toronto and Washington — iSchools, for short. It’s a different i-, but it too stands in for a connection between technology and the social world.

Striking to me are the first two lines: that Computer Science isn’t where you go to find out about the relationship between people and technology. That’s a strong statement, and not one that I agree with, nor one that I think characterizes some of the work that does go on within Computer Science departments. But I think it is fair to say that it’s the less commonly travelled road for Computer Science today, and that’s a choice I think the discipline and its practitioners may want to revisit in the face of the future.


A Name for Computer Science

In academia, computer science, discipline on March 19, 2010 at 10:24 am

This post continues a series of reflections on the discipline that is sometimes known as Computer Science.

A while ago I wrote about some of the naming conventions that we use in our discipline. One is to name around function: networking, security, operating systems. In this we choose a function of the machine itself, suggesting a separation of parts. I can’t help thinking that this separates theory from practice: in theory these are distinct, but in practice all these things must work together. And in practice there are collaborations across the fields. Another naming scheme we use emphasizes the greatness of the machine or its complexity. I think of high-performance computing and many-core computing as two examples. Rhetorically, we choose abundance over scarcity.

But naming also seems to be part of our disciplinary discussion right now.

One question turns on Computing versus Computer Science. There is, for example, the “Computing Community Consortium”: an entity designed to promote, well, Computing or Computer Science? The College of Computing uses Computing distinctly from Computer Science, but I don’t know whether the CCC or the Computing Research Association sees the two as distinct. If there is a distinction, what is it?

The College has three schools: Computational Science and Engineering, Computer Science, and Interactive Computing, like other free-standing Schools (Colleges at Georgia Tech; Georgia Tech has other naming issues that are beyond the scope of this particular post). We adopted this structure recently, and I have always assumed that we separated into Schools because the College was getting a little unwieldy: a solution to our increased size, and also a response to what I call the devolution of Computer Science.

But, to share another example, UC Irvine has a free-standing Computing school, the Donald Bren School of Information and Computer Sciences, and it comprises three departments: Computer Science, Informatics, and Statistics. At some point, and this seems to be it, free-standing schools/colleges move towards a structure that has internal divisions. And of course other Universities are also working with a division, and working out what’s housed where. The University of Washington strikes me as an example, where you can find people with a research interest in HCI who have affiliations in Computer Science and Engineering, the Information School, and Human Centered Design and Engineering.

What will this become? It is clearly a work in progress, not just here at Georgia Tech but more broadly. And as for me? I think we should postpone local discussions about what Computer Science is until that’s decided nationally, and simultaneously we should participate in those national discussions, since Georgia Tech has such a stake in them.

History and Impact

In academia, academic management on March 16, 2010 at 7:33 pm

The Royal Society is celebrating its 350th anniversary. To coincide with a series of exhibits about scientific advances in the past, the Royal Society is also taking the opportunity to reflect on the future of British science. Here’s an excerpt from the report.

From Faraday to the iPod

Michael Faraday was a leading light of 19th century science. He began his career as secretary to Sir Humphry Davy, himself a formidable chemist and inventor. Faraday then joined the Royal Institution, where his experiments allowed him to elucidate the principles of electromagnetism and build the first dynamo. Explaining a discovery to then Chancellor of the Exchequer William Gladstone, Faraday was asked, ’But after all, what use is it?’ He famously, but perhaps apocryphally, replied, ’Why sir, there is every probability you will be able to tax it’.

Faraday’s ideas were taken forward by James Clerk Maxwell, Lord Kelvin and numerous others, including Albert Fert and Peter Grünberg. Fert and Grünberg received the 2007 Nobel Prize in Physics for work on giant magnetoresistance, showing that tiny changes in magnetism can generate large changes in electrical resistance. Their 1988 discovery revolutionised the way that computers store information. The minuscule hard drives inside laptops and the earliest iPods would have been impossible without Faraday’s pioneering work more than 150 years earlier.

I’m not entirely against the narratives of impact that permeate the academy. We tell each other that we should strive for impact in our research; it’s more than the production of knowledge, it’s impact. But I think Faraday provides a useful reminder: not everything that is discovered now will have its full impact in a short time frame. Perhaps impact should be treated like the history that it is, something that cannot be assessed until sufficient time elapses. Early judgements are likely to fall short, and potentially to be imbued with the personal orientations of those who make the assessments.

CS Principles

In computer science, discipline, empirical on March 13, 2010 at 2:48 pm

I really like these principles. I know that they’re focused on Advanced Placement Computer Science (this is the American notion of allowing some students to take college-level courses in high school). But I think they have broader relevance (and the content behind each of these principles could distinguish what they are used to teach). What I like is that they are organized around principles that cross-cut a set of disciplinary silos Computer Science has gotten too used to. They say nothing about HCI, Systems, or (a little about) Networking per se, but require skills in all these areas. Perhaps this is just where I work, but we seem too siloed, and too mired in discussions about what fields matter the most to Computer Science. I think these present a constructive way of moving forward, by placing an emphasis on why any of these areas matter, and blending theory with practice.

Computing is a creative human activity that engenders innovation and promotes exploration.

Of course I’m going to like this, but I think it does a nice job of putting the human in the loop of systems production, while simultaneously suggesting that it’s not just about building things for profit or to solve business problems. Innovation is an exploration of the new… that sounds a lot more exciting than going to work for a code shop. And it also reflects an interesting reality for Software Engineering. The hard work in Software Engineering is building the solution the first time; once built (as long as it’s not changed) it’s a matter of copying it. All the work is in the first effort, and believe me, it’s a lot of problem-solving exploration. I know because I used to work with people who did this for a living.

Abstraction reduces information and detail to focus on concepts relevant to understanding and solving problems.

This seems to me to be a fundamental skill of computing. I think it can be read in different ways; here are two. One is the skill required during modular decomposition, to get the right solution parts from the problem statement. Another is turning piles of interview data, through the process of analysis, into a set of outcomes, perhaps implications for design. But it’s always the same type of challenge: a winnowing of the data through established analytic principles and mechanisms to get to a focus on the core.

Data and information facilitate the creation of knowledge.

I think there’s a relationship between abstraction and data: abstraction is the process by which you manage the data here, perhaps even part of the process of turning it into knowledge. But there are other ways to manage data and to generate knowledge that turn on other innovations. One that springs to my mind is data visualization. I think that’s a socio-computational system where people and machines work together to create meaning from data. This also connects Computer Science with Information Science in a really nice way. I think it’s still not quite clear, from a disciplinary stance, where the boundary between Computer Science and Information Science lies here. Interestingly for Georgia Tech, it also connects Computer Science to Computational Science and Engineering.

Algorithms are tools for developing and expressing solutions to computational problems.

Recently we had a talk in which someone declared that the three most important areas of computing were algorithms, algorithms, and algorithms. Some of us think (hope) he was joking, especially when he added that people were just algorithms. But, needless to say, he’s not completely wrong: algorithms are central to a discipline of machine manipulation, since they are the mechanism of machine control. Nuff said.

Programming is a creative process that produces computational artifacts.

I’m all for creativity in programming. Frankly, I think that Software Engineering has come lately to creativity; agile programming seems to make room for creativity, but earlier forms of software process management tended to over-emphasize control. I think that this was largely the product of two circumstances. First, the “software crisis” has long plagued the field. The software crisis is the difficulty of predicting how long, how expensive, and so forth any software project will be. Software Engineering has been in pursuit of better predictive methods to handle the software crisis. Second, Software Engineering was initially work done within the military context; the early big systems were military ones. And military environments have a command-and-control form of management, so it was not totally surprising that that was the management style applied to software engineering. So, I think creativity is important. Now, a colleague of mine has pointed out that programming can also lead to creative user experiences, which could be missed in the use of the word artifact, but the descriptions of what could be learnt seem to include that more. Perhaps add “and experience” onto the end of the sentence.

Digital devices, systems, and the networks that interconnect them enable and foster computational approaches to solving problems.

Networks are socio-computational systems. The technical network has given rise to applications that have enabled communication and coordination. Social media is perhaps the most recent and powerful example. That’s clear in the description. I suppose if I had a nit it would be that the human side doesn’t come out quite as strongly in the principle. I’m not sure that Facebook solves a problem per se; I think it allows us to be human, to connect and communicate. And that doesn’t seem problem-centered to me. It’s not business communication or whatever; rather, it seems to be an essential human trait. Again I owe this to the same colleague, thanks Matthew. Perhaps I would have said: Digital devices, systems, and the networks that interconnect them enable and foster computational approaches to solving problems and support human experience. I notice that I’m just adding the word experience everywhere.

Computing enables innovation in other fields including science, social science, humanities, arts, medicine, engineering, and business.

So one possibility for what’s missing above is addressed here. This is the area of socio-computational systems. Interestingly, this principle is a hybrid. The principle focuses on fields, and that could be read as being about the state of the knowledge. But in the text it’s also clear it’s about the state of the practice. Teasing the two apart could be useful. Computing is a platform for advances in other fields; my colleagues in Computational Science and Engineering are working on some of these advances for other fields of Science and Medicine. But that’s different from changing the state of the practice, from changing business practice, from changing how we live, work and play. Computing is doing both.

Although I hesitate to use the word change. I sometimes think that it’s more the case that it enables new ways to do what we always did. For example, web banking allows me to do banking from a far greater range of places and at times when physical banking would be impossible. But I’m still using it to do banking. The institution of banking, of having checking (current) and savings accounts, of making transfers, of payments, and so forth, is still pretty much the banking system I grew up with. Oddly enough, I think Facebook might be more of a change agent. I would like to have kept up with people from my high school, but I couldn’t possibly do that with letters or even emails, frankly. The heartbeat, the pulse of life that I get through the status updates I send and read, does allow me to learn a little bit about what the huge number of people I’ve interacted with over the years are doing today.

Education: those next 20 years…

In academia, computer science, research on March 5, 2010 at 9:49 am

I’m getting closer to my colleague Mark Guzdial’s intellectual turf here, but I saw two things recently that made me pause and take a moment.

The first was an update on a fluid situation: the question of how much of a budget cut the University System of Georgia will take next year. It’s still a work in progress; note that the article says there was another meeting on Wednesday, which concluded with the situation still fluid. The list of cuts from each University appears towards the end of the article, and it makes for sobering reading. I say that as a State employee myself, one who works for Georgia Tech.

The second was the announcement that Hawaii’s schools will be moving to a four-day school week. I’m not a parent, but I can imagine that this is going to be something else for the dual-income parents of Hawaii. There has always been an alignment between the school week and the work week, not perhaps a complete one, but this is certainly going to expose more of the assumptions that come with that alignment. And that’s before even thinking about education. As the article comments, people are concerned about the educational impact of less instruction time on Hawaii’s children.

Together they paint a picture of an educational system in transformation. No one is predicting that this recession will end soon. Many suggest that there will be a recovery, but that it will be slow. The effects of this could be as long-lasting as the generation experiencing it (those in education at this time). Intriguingly (and somewhat politically, I’ll observe), it is people who are largely not in education who are deciding the fate of those who are.

Recently, Dick Lipton posted an article about education, asking whether Universities could become extinct in the next 25 years, one that Mark and I built on. I have to wonder whether these are just two data points in a sea of others that set up the conditions for just such an event. If we’re optimistic, we can hope that at the end of this time we’ll look back and be able to see innovations that set up better conditions for the next generation. I’m trying to keep focused on that. And, one bright spot: the University of Michigan.