In the 1990s, I spoke with a manager of a research department at
a company that had spent a few million dollars supporting the
Cyc project. They had access to all the Cyc software at the time,
they had sent some of their employees to attend the Cyc courses
in Austin, and they had several of their employees work on projects
that used Cyc. When I asked him about their experiences, he replied:
Anonymous manager:
Over the years, several people in our department have spent time
working with Cyc. And a funny thing is that every one of them has
been fired. And I don't believe that's a coincidence.
I knew some of the people who worked with Cyc at that company.
They had advanced education in computer science and AI. They were
also highly motivated to find practical applications of Cyc.
And their management gave them the resources. But they failed.
Since 2004, research funding at Cyc has been sharply cut back, and
they have had to shift their emphasis to making money by implementing
practical systems for clients with jobs that need to be done.
I would like to know (1) why Cyc in its earlier form did not lead
to practical applications, (2) what the "disconnect" is between Cyc
and mainstream IT, and (3) how they have changed their methodologies
and interfaces in order to implement practical applications.
As I said, one talk by somebody from Cyc would be useful. But even
more useful would be some talks by people who worked with Cyc in
the past and who learned (the hard way) what works and what doesn't
work when trying to build *large systems* and to build interfaces
between *large systems* and AI technology of any kind.
John