
NI co-founder Jeff Kodosky talks about the path ahead for LabVIEW

Known as the Father of LabVIEW by engineers and scientists worldwide, Jeff Kodosky co-founded National Instruments in 1976 and has continued to mentor the organisation and pioneer the graphical system design approach through NI LabVIEW.

Kodosky's invention of LabVIEW was named one of the 'Top 50 Milestones for the Industry', and he holds 68 patents associated with LabVIEW technology. PACE editor Kevin Gomez caught up with Kodosky in Austin, Texas, at NIWeek 2013 to get some clues on the evolution and future of LabVIEW.

Was the GPIB board the start of LabVIEW?
It was definitely the start of the company. It put us front and centre in automated test, at the intersection of computers and instruments. It gave us a lot of insight, because a customer would buy our GPIB board, plug it in, connect the instruments, and if it didn't work they'd call us, since we were the ones connecting all of these things.

So we learned a lot about what customers were doing, a lot about how the instruments were working and a lot about how to diagnose and troubleshoot over the phone. The software that we'd built for our boards had a lot of extra capability to help us troubleshoot problems. We learned a lot about measurement and automation because we were right in the middle and that set the stage for our later expansion into virtual instrumentation.

How did G language evolve?
Once we were established as a GPIB supplier we got to thinking: what could we do next to simplify the task of automating measurement systems? We said, well, why don't we come up with some software tools that would make it easier? We looked at a lot of different things, including making libraries for C and BASIC and so on.

We went out and talked to customers, and they said, yeah, if you have a better library for BASIC we'll use it. But that's not exactly what we had in mind.

We wanted to do something a little more revolutionary and a little more far reaching. So I went up to an office near campus, scratched my head a lot and pottered around. A major milestone for me was when I got introduced to the Macintosh computer. 

All of a sudden, seeing those graphics made me realise: that's the future of human-computer interaction. I said, this has to be part of our solution.

We've got to be able to build virtual instruments on a computer and connect up to real instruments, or connect up to the data acquisition part, or whatever. We've got to have these front panels, because people could then operate it without having to read a big manual; they could experiment.

Programmers were writing BASIC programs to automate a few instruments and collect data; asking them to write an interactive graphical interface program would be impossible. They just wouldn't be able to do that. So how would we program a system like that?

So I asked Dr T: give me a few more months and maybe I can figure out a way we can use the graphics to do the programming. (Dr James Truchard is the president, CEO and co-founder of National Instruments, often referred to as 'Dr T'.)

I was young and naïve and didn't know that conventional wisdom said it was impossible. So I pottered around, thinking of different ways we could use graphics. I looked at dataflow; it was nice, but as soon as you tried to do loops it got inordinately complex.

I looked at flow charts, but there wasn't enough leverage in the use of graphics. Nothing seemed to be a eureka kind of answer. I kept coming back to dataflow because of the phenomenal level of understanding it gives: you can see what's going on and understand it very easily.

So somewhere along the line it occurred to me that if we could add control structures from structured programming into dataflow, we would simplify the creation of loops, case structures and so on. It would have the simplicity of dataflow and the composability of structured programming.

That was when we realised, okay this is a germ of an idea that could work. So we kicked off the LabVIEW project and the rest is history. 
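In today's terms, that 'structured dataflow' idea can be sketched in a few lines of Python. The sketch is purely illustrative, and in no way LabVIEW's implementation: the loop body stays an acyclic dataflow diagram, while a structured loop node supplies the repetition and carries state between iterations, much as a shift register does on a LabVIEW while loop. The read_sample node here is a made-up stand-in for acquisition hardware.

    import random

    def while_loop(body, init):
        # Structured loop node: runs the body subdiagram repeatedly,
        # wiring each iteration's output back to its input (the role a
        # shift register plays on a LabVIEW while loop).
        state, keep_going = init, True
        while keep_going:
            state, keep_going = body(state)
        return state

    def read_sample():
        # Hypothetical acquisition node; stands in for real hardware I/O.
        return random.uniform(0.0, 1.0)

    def body(total):
        # The body itself is pure dataflow with no cycles: values flow
        # from the acquisition node through an add node to the outputs.
        new_total = total + read_sample()
        return new_total, new_total < 10.0   # second output feeds the loop condition

    print("accumulated:", while_loop(body, 0.0))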

How do you respond to those who prefer abstract programming languages?
I have a pragmatic realisation that there are going to be people who like to program in conventional languages. That's what they like, and they're going to stick with it. That's fine. So for those folks I'd say: do your C, your Python or whatever kind of programming you want, and connect it into LabVIEW.
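In the LabVIEW of that era the usual bridge was a shared library: LabVIEW's Call Library Function Node can call into a C DLL, and a DLL built from LabVIEW can be driven from other languages. As a rough sketch of what the Python side might look like, assuming a hypothetical library measurement.dll that exports a function acquire_mean:

    import ctypes

    # Load the (hypothetical) shared library. It could be custom C code
    # that LabVIEW also calls through a Call Library Function Node, or a
    # DLL built from LabVIEW VIs that this script drives.
    lib = ctypes.CDLL("./measurement.dll")

    # Declare the C signature: double acquire_mean(double *buf, int n)
    lib.acquire_mean.argtypes = [ctypes.POINTER(ctypes.c_double), ctypes.c_int]
    lib.acquire_mean.restype = ctypes.c_double

    n = 1024
    buf = (ctypes.c_double * n)()      # C-compatible sample buffer
    mean = lib.acquire_mean(buf, n)    # fills the buffer, returns the mean
    print("mean of", n, "samples:", mean)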

Leverage LabVIEW where it makes sense and do your custom programming in your own language where that makes sense.

My own philosophical view is that the world is parallel. We have parallel processes running in our brains; we're doing multiple things at once all the time. We actually have to learn to think sequentially, by taking computer science courses.

We study and figure out how to do that so we can try to solve a problem purely sequentially. Then we take that sequential program and put it on a multicore computer so that we can get some parallelism and some performance back. Seems a little weird.

Why wouldn't we want to think of parallel solutions right off the bat? We do that in LabVIEW; it's naturally parallel. That's why it was so easy to map to FPGAs, which are the ultimate parallel framework. So I don't get into religious fights with programmers anymore, it's just not worth it, but I think the future is going to be more and more graphical programming.
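The contrast he's drawing is easy to see in code. Two LabVIEW nodes with no wire between them are free to execute at the same time; in a sequential language the programmer has to ask for that explicitly. A small illustrative sketch, with the instrument reads simulated:

    import time
    from concurrent.futures import ThreadPoolExecutor

    def read_instrument(name, seconds):
        # Simulated slow instrument read standing in for hardware I/O.
        time.sleep(seconds)
        return name + ": done"

    # The two reads share no data, so, like two unwired nodes on a
    # LabVIEW diagram, there's no reason to run them back to back.
    # Here, though, the parallelism has to be requested explicitly.
    with ThreadPoolExecutor() as pool:
        a = pool.submit(read_instrument, "DMM", 1.0)
        b = pool.submit(read_instrument, "scope", 1.0)
        print(a.result(), b.result())  # finishes in about 1 s, not 2 s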

We'll have specialists who do lower-level programming to write high-performance routines for certain applications, but the overwhelming majority of programs will end up being graphical at some point in the future. Just as the overwhelming majority of programming done today is sequential programming in a high-level language, not assembly.

There are a few assembly language programmers today, but most people recognise that working at that level is just not worth it. We've got good enough compilers, and we're willing to sacrifice a percentage of the machine's performance to be able to work at a higher level, and we do. I think the transition to graphical languages will be similar; it may take a longer time frame and it may not appeal to everybody, but that's my view on where we stand.

Is mobile the next big leap for LabVIEW?
Mobile devices need to be exploited in test, measurement and control applications. Something like Data Dashboard, being able to view data on machines nearby or remotely, is clearly a need, and that's why we're doing it. How much more, and on what timescale, is still kind of fuzzy. We want to make sure we're figuring out how we can use new technology and make it available to our broad base of users.

Sometimes it's not clear what the long term is going to be, so we have research projects and experiments that we run. I definitely think mobile devices are going to have a profound influence, although I'm not exactly sure where. I know it's in the HMI area, but I'm unsure how much more we will see.

Over the years we have been working on major advances that are poised to be included in the LabVIEW framework. The higher level of abstraction we refer to is a system diagram: basically a graphical representation of the project where you can see all the components in your system and how they're interconnected, both at a hardware configuration level and at a software distribution level.

Looking back, is there anything you would have done differently?
Well, of course, if I knew back then what I know now, it could have influenced a lot of things. But the technology we had back then is nothing like what we have available today. Even if I had known in the late 80s that FPGAs were coming in the late 90s, I wouldn't have been able to do anything about it.

I think we made different trade-offs because there's always more to do than there are resources available. If we did this before that, is it better or worse? You can't really tell, so here's a simple example. Up until LabVIEW 4 there was no undo in LabVIEW; if people made a mistake, tough luck.

Undo was the number one requested feature for a decade, but we didn't know how to do it efficiently and it took us a long time to figure out. In LabVIEW 4 we finally implemented a one-step undo, and I was arguing for releasing it because I thought it would be a giant benefit: it's an infinite improvement over a zero-step undo.

But other people on the team felt it would be better to wait one more release and have a multi-step undo. Their decision prevailed, and we released a multi-step undo that has worked flawlessly ever since. Would we have been better off introducing the one-step version earlier, solving part of the problem but not the whole problem, or waiting until we had the multi-step?

I don't know. We did it this way, it seems to have worked fine, and I don't know how to second-guess it. There have been a lot of cases, a lot of decisions along the path, where it wasn't clear which option was best when we made the choice. Thinking about it again, you might feel another choice would have worked a little better, but it's not really clear that it would.
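For the curious: the textbook way to build a multi-step undo, which is not necessarily what the LabVIEW team did, is a pair of command stacks. The sketch below is generic Python for illustration; the hard part in a real editor is making every edit reversible and keeping the history's memory use bounded, which is why 'efficiently' took years.

    class Command:
        # One reversible edit; concrete edits implement apply/revert.
        def apply(self): ...
        def revert(self): ...

    class SetValue(Command):
        # Minimal concrete edit for demonstration: set a key in a dict.
        def __init__(self, obj, key, new):
            self.obj, self.key, self.new = obj, key, new
        def apply(self):
            self.old = self.obj[self.key]
            self.obj[self.key] = self.new
        def revert(self):
            self.obj[self.key] = self.old

    class UndoManager:
        # Multi-step undo/redo via two stacks; a one-step undo is the
        # same idea with the undo stack capped at depth one.
        def __init__(self):
            self._undo, self._redo = [], []
        def do(self, cmd):
            cmd.apply()
            self._undo.append(cmd)
            self._redo.clear()          # a new edit invalidates redo
        def undo(self):
            if self._undo:
                cmd = self._undo.pop()
                cmd.revert()
                self._redo.append(cmd)
        def redo(self):
            if self._redo:
                cmd = self._redo.pop()
                cmd.apply()
                self._undo.append(cmd)

    settings = {"gain": 1.0}
    mgr = UndoManager()
    mgr.do(SetValue(settings, "gain", 2.0))
    mgr.do(SetValue(settings, "gain", 5.0))
    mgr.undo()                          # gain back to 2.0
    mgr.undo()                          # gain back to 1.0
    print(settings)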
