David, Phil, and John B,

I suggest a very simple definition for Big Data:
Data whose size N (in bytes) is so large that any algorithm whose
running time is polynomial in N, with any exponent greater than 1,
is prohibitively expensive on existing hardware.
This definition scales with the technology. It was true in 1960, when
people did research on sorting algorithms that ran in O(N log N) time.
Computers today are a million times bigger and faster than in 1960,
but Big Data today still cannot be processed by any polynomial-time
algorithm with an exponent greater than 1.
And that definition will still be true when computers are a million
times bigger and faster than today's.
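A rough back-of-the-envelope calculation makes the scale argument concrete. The sketch below assumes a hypothetical machine executing 10^12 simple operations per second and a data set of N = 10^15 bytes (a petabyte); both figures are illustrative, not from the message above. It compares an O(N log N) algorithm with an only modestly superlinear O(N^1.5) one:

```python
import math

def est_seconds(ops, ops_per_sec=1e12):
    # Estimated wall-clock time, assuming a (hypothetical) machine
    # that performs 1e12 simple operations per second.
    return ops / ops_per_sec

N = 1e15  # one petabyte of data, counted in bytes

# O(N log N): the classic sorting bound -- still feasible at this scale.
t_nlogn = est_seconds(N * math.log2(N))

# O(N^1.5): a superlinear polynomial with exponent > 1 -- prohibitive.
t_n15 = est_seconds(N ** 1.5)

print(f"N log N: about {t_nlogn / 3600:.0f} hours")
print(f"N^1.5:   about {t_n15 / (3600 * 24 * 365):.0f} years")
```

Under these assumptions the N log N pass finishes in well under a day, while the N^1.5 algorithm needs on the order of a thousand years, which is the sense in which any exponent greater than 1 is "prohibitively expensive".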
John
_________________________________________________________________
Message Archives: http://ontolog.cim3.net/forum/ontolog-forum/
Config Subscr: http://ontolog.cim3.net/mailman/listinfo/ontolog-forum/
Unsubscribe: mailto:ontolog-forum-leave@xxxxxxxxxxxxxxxx
Shared Files: http://ontolog.cim3.net/file/
Community Wiki: http://ontolog.cim3.net/wiki/
To join: http://ontolog.cim3.net/cgi-bin/wiki.pl?WikiHomePage#nid1J