Depends on the specific case. I have it on good authority that a few "bleeding edge" ones essentially repacked/wrapped YOLOv3. The purpose was specifically tracking in adversarial conditions (smoke, including smokescreens, obstacles, etc.).
For realtime on the edge the YOLO series is pretty good, I don't think anyone would disagree. Most of the really advanced stuff, like vision-language models, requires a lot more compute and power budget.
In addition to the official reference to CMU, there is a second origin for the name.
SBCL - Sanely Bootstrappable Common Lisp
You see, when SBCL was forked from CMU CL, a major effort was made so that it could be compiled by any reasonably complete Common Lisp implementation, unlike CMU CL itself. CMU CL could essentially only be compiled by itself, preferably the same version, which meant that compiling and especially cross-compiling was a complex process involving carefully migrating the internal state of a running CMU CL process to the "new version".
SBCL heavily reworked this so that the core SBCL compiler can be hosted in any mostly-complete (it does not have to be complete!) ANSI CL implementation, which is then used to compile the complete system.
Meaning you can grab an SBCL source tarball, plus one of GNU CLISP, ECL, Clozure CL, even GNU Common Lisp at one point, or any of the commercial implementations (including, of course, CMUCL), and a C compiler (for the thin runtime support), and build a complete and unproblematic SBCL release with a few commands.
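For the curious, a bootstrap session looks roughly like this. The version number and the exact host-Lisp invocation below are illustrative examples, not pinned values; SBCL's INSTALL file documents the real procedure:

```shell
# Illustrative SBCL bootstrap recipe; version and host invocation are examples.
SBCL_VERSION=2.4.0                      # hypothetical release number

# 1. Unpack a source tarball and enter it (commands shown, not run here):
#    tar xjf "sbcl-$SBCL_VERSION-source.tar.bz2" && cd "sbcl-$SBCL_VERSION"

# 2. Build, pointing --xc-host at any mostly-ANSI host CL, e.g. GNU CLISP
#    (a C compiler is needed for the thin runtime):
XC_HOST='clisp -ansi -on-error abort'
BUILD_CMD="sh make.sh --xc-host=\"$XC_HOST\""
echo "Would run: $BUILD_CMD"

# 3. Install (as root): sh install.sh
```

With no --xc-host argument, make.sh defaults to using an already-installed SBCL as the host.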
As a former owner of a T470: Lenovo included a pretty beefy component from Intel that was supposed to be feature-complete by itself for dynamically managing thermals, including funky ideas like detecting whether you were using the laptop on your legs and reducing thermals then, but giving full power when running plugged in at a desk.
Time comes for delivery, and Lenovo finds out that Intel did a half-assed job (not the first time; compare the earlier Rapid Start "hibernation" driver). The result is the Kaby Lake T470 (and the X270, which shares most of the design) having broken thermals when running anything other than Windows without a special Intel driver, leading to funny tools that run in a loop poking at an MSR in the CPU, in constant whack-a-mole with a piece of code deep in the firmware.
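The whack-a-mole loop itself is simple to sketch. This is a hypothetical miniature of what such tools do, not any specific tool's code: pack a PL1 power limit into the documented layout of Intel's package power limit MSR (0x610, per the Intel SDM) and periodically re-write it through /dev/cpu/N/msr, racing the firmware that keeps resetting it. It needs root and the msr kernel module; the wattage, period, and function names are made up for illustration.

```python
# Hypothetical sketch of an MSR whack-a-mole tool; the MSR address and bit
# layout follow Intel's documented RAPL interface, everything else is made up.
import struct
import time

MSR_PKG_POWER_LIMIT = 0x610  # package power limits (PL1/PL2), per Intel SDM


def pack_pl1(watts, power_unit=8):
    """Encode a PL1 limit into the low bits of MSR 0x610.

    power_unit is the divisions-per-watt factor derived from
    MSR_RAPL_POWER_UNIT; 1/8 W units are typical.
    """
    limit = int(watts * power_unit) & 0x7FFF  # bits 14:0: power limit
    return limit | (1 << 15) | (1 << 16)      # bit 15: enable, bit 16: clamp


def reapply_forever(cpu=0, watts=25.0, period=5.0):
    """Re-write the limit every few seconds, undoing the firmware's resets.

    Requires root and the msr kernel module (/dev/cpu/N/msr); the device
    supports seeking to the register address and writing 8 bytes.
    """
    value = pack_pl1(watts)
    while True:
        with open(f"/dev/cpu/{cpu}/msr", "r+b") as msr:
            msr.seek(MSR_PKG_POWER_LIMIT)
            msr.write(struct.pack("<Q", value))
        time.sleep(period)
```

The loop exists purely because the firmware periodically rewrites the same register; neither side "wins", they just keep overwriting each other.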
Unfortunately it started to be taken seriously, at least by academics, who went on to infect the industry. I shit you not: the Software Project Management module I took at university, back in 2010/2011, described Agile as "Waterfall but done much faster".
It's a lot like "GOTO considered harmful", where everyone knows the title (which was changed by the editor, Niklaus Wirth, not Dijkstra) but not the actual discussion (both Dijkstra's letter and Knuth's response).
In the late 1990s, supposedly, a considerable driver of Mac use for DTP was that Quark could be significantly automated with AppleScript, and some publishing houses had non-trivial workflows built that way to reduce time spent on preparation.
TNG mainly used practical effects early on, which had the benefit of higher visual quality but meant not as much could be done within the budget.
Babylon 5's early and very enthusiastic use of CGI meant that the scope of what it could show was ridiculously bigger, without reusing clips as much as some other shows did.
DS9 literally used information from the B5 series bible, because Paramount had access to it after JMS's unsuccessful pitch to make B5 there. The decision to make DS9 came after JMS managed to get B5 picked up elsewhere following Paramount's rejection.
That said, DS9 is its own thing, just with B5 "roots".