

Summarize that sentence into a thumbs up or thumbs down emoji.


A single point of data rarely answers the question unless you’re looking for absolutes. “Will zipping 10 files individually be smaller than zipping them into a single file?” Sure, easy enough to test once. But what kind of data are we talking about? How big, and how random, is the data in those files? Does it keep getting better with more files, or is there a sweet spot where it’s better, with worse results if you use too few files or too many? I don’t think you could test all those scenarios very quickly, and they all fall under the original question. OTOH, someone who has studied the subject could probably give you an answer easily enough in just a few minutes. Or they could try a web search and find the answer, which pretty much comes down to, “It depends which compression system you use.”
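For what it’s worth, the one-off test is easy to script. Here’s a rough Python sketch (the file contents are made-up toy data) comparing one combined archive against ten single-file archives:

```python
import io
import zipfile

# Toy data: 10 "files" of highly repetitive (very compressible) text.
files = {f"file{i}.txt": (b"hello world " * 200) for i in range(10)}

def zip_size(members):
    """Total byte size of a zip archive containing the given members."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        for name, data in members.items():
            zf.writestr(name, data)
    return len(buf.getvalue())

# One archive holding all 10 files vs. ten single-file archives.
combined = zip_size(files)
individual = sum(zip_size({name: data}) for name, data in files.items())
print(f"combined: {combined} bytes, individual total: {individual} bytes")
```

Worth noting that zip compresses each member independently (no solid compression across entries), so for this data the combined archive mostly just saves the per-archive overhead; a format that compresses the whole stream, like .tar.gz, would behave differently.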


Pretty much everything you said is incorrect, except for the article age. Valetudo literally wrote software that does this on multiple models locally, including mapping, and the response of the manufacturers whose models were capable of it was to release new versions where it wasn’t an option. As for servers and local control, there are a number of solutions for those with the knowledge and hardware to set them up, and the only thing stopping robovac companies from supporting this is money (or rather, less of it).


We could still live in caves, but most of us have chosen not to. I’m personally of the opinion that every advancement that gives you more time to do things that are important to you is worth it. That doesn’t mean inviting in every piece of spyware some company tries to thrust upon me is acceptable, either.


The Pebble Time 2 has a heart rate monitor. I can’t say if the rest of your statement is correct or not.
There was a story about a researcher using evolutionary algorithms to build more efficient systems on FPGAs. One of the weird shortcuts involved a design that normally needed a clock circuit, but none was available, so the evolved solution made a dead-end circuit that would give an electric pulse when used, giving it a makeshift clock. The big problem was that the efficiency gains often relied on quirks of the specific board, so his next step was to start testing the results on multiple FPGAs and using the overall fitness to get past those quirks/shortcuts.
Pretty sure this was before 2010. Found a possible link from 2001.
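The multi-board fix is basically averaging fitness across environments so board-specific exploits stop paying off. A toy Python sketch of the idea (the fitness function, quirk model, and all the numbers here are made up purely for illustration; real FPGA evolution measures actual circuit behavior):

```python
import random

random.seed(0)

# Each "board" gets its own quirk that a candidate could overfit to.
NUM_BOARDS = 4
BOARD_QUIRKS = [random.uniform(-0.2, 0.2) for _ in range(NUM_BOARDS)]

def fitness_on_board(candidate, board):
    """Toy fitness: closeness of candidate to the 'true' target 1.0,
    plus a board-specific bonus that rewards exploiting that one board."""
    base = -abs(candidate - 1.0)
    quirk_bonus = BOARD_QUIRKS[board] * candidate  # board-specific exploit
    return base + quirk_bonus

def overall_fitness(candidate):
    """Average across all boards, which washes out single-board quirks."""
    return sum(fitness_on_board(candidate, b) for b in range(NUM_BOARDS)) / NUM_BOARDS

# Simple hill-climbing stand-in for the evolutionary loop:
# mutate, keep the child only if its overall fitness improves.
best = random.uniform(0.0, 2.0)
for _ in range(300):
    child = best + random.gauss(0, 0.1)
    if overall_fitness(child) > overall_fitness(best):
        best = child

print(f"best candidate: {best:.3f}")
```

Because selection uses the averaged score, a candidate that only scores well thanks to one board’s quirk loses out to one that does decently everywhere, which is exactly why testing on multiple FPGAs broke the shortcut.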