Rich,
First, a correction: in the following, I meant to say 14 gigabytes,
not 14 terabytes. Even so, the test involved a significant amount of
processing.
> As for performance, the space overhead for one agent is about 1K bytes
> (plus whatever space the application uses). In one test, the FMF used
> thousands of agents to process 14 gigabytes of data on a machine with
> 8 CPUs in 15 minutes. When 7 CPUs were turned off, it took almost
> 2 hours. That's almost linear scaling. (The data was provided by the
> client, and the VivoMind system generated better results in less time
> than the previous software that the client had been using.)
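
For anyone who wants to see that pattern in runnable form, here is a
minimal Go sketch of the idea (an illustration under assumptions, not
the FMF itself): thousands of lightweight goroutine "agents" pull
chunks of work from a shared queue, so wall-clock time shrinks almost
in proportion to the number of CPUs enabled. The agent count, chunk
count, and the process() busy-work function are all hypothetical
stand-ins for whatever the application actually does per chunk.

package main

import (
	"fmt"
	"runtime"
	"sync"
	"time"
)

// process stands in for the per-chunk work an application agent would
// do. (Illustrative CPU-bound busy-work only.)
func process(chunk int) int {
	sum := 0
	for i := 0; i < 1000000; i++ {
		sum += (chunk + i) % 7
	}
	return sum
}

// run feeds nChunks of work to nAgents goroutine "agents" and returns
// the elapsed wall-clock time.
func run(nAgents, nChunks int) time.Duration {
	work := make(chan int, nChunks)
	for c := 0; c < nChunks; c++ {
		work <- c
	}
	close(work)

	start := time.Now()
	var wg sync.WaitGroup
	for a := 0; a < nAgents; a++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for chunk := range work {
				process(chunk)
			}
		}()
	}
	wg.Wait()
	return time.Since(start)
}

func main() {
	const nAgents, nChunks = 1000, 2000

	// Run the same CPU-bound workload on 1 CPU, then on all CPUs.
	// With no shared state to contend for, the speedup should be
	// close to the CPU count, i.e., near-linear scaling.
	runtime.GOMAXPROCS(1)
	t1 := run(nAgents, nChunks)
	runtime.GOMAXPROCS(runtime.NumCPU())
	tn := run(nAgents, nChunks)
	fmt.Printf("1 CPU: %v, %d CPUs: %v, speedup %.1fx\n",
		t1, runtime.NumCPU(), tn, float64(t1)/float64(tn))
}

On an 8-core machine the printout should show a speedup approaching 8x,
which is the same effect as the 15-minute vs. 2-hour comparison above
(8 x 15 minutes = 120 minutes).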
JFS
> However, most AI systems for NLP and informal reasoning make do with
> any data they get.
RC
> Agreed - there are many data-wrangling tools. Embedding one or more
> DBMSs also makes great sense, even very local DBMSs that are shared
> only among a LAN's users for security reasons.
Yes, our clients are professionals who know how to handle large volumes
of data with all the required locking, etc. What they need are methods
for processing and interpreting those volumes. They don't need or want
the system to update the old data.
John