“Either they leave or I leave” would be setting a boundary. Get some other people on your side in it.
In the Memex crowd-thinking environment for thoughts unthinkable to separate beings, human-machine general intelligence raises superintelligent offspring to help all life.
Which is exactly what they don’t want us to be able to discuss.
(This would allow harm taxes, fines, prisons, mind altering, and just war against those breaking this law. The best compromises can be found by parallel experimentation.)
(This would allow forestry, agriculture, and livestock breeding/genetic engineering, but not intensive animal farming or hunting. Only animals that died from natural causes could be eaten. The “natural causes” would then be engineered to minimise suffering and to metastabilise the ecosystem wisely, possibly adding mercifully-killing hunters to control animal populations. In the case of “intelligent” beings failing to control their reproduction, they could be given chances to risk their own lives to gain freedom from static storage or death, with optional mind transmission for the mostly harmless, in the hope that someone somewhere runs them on a computer.)
From my much longer answer to https://www.quora.com/If-you-were-to-come-up-with-three-new-laws-of-robotics-what-would-they-be/answers/23692757
The lack of interpersonal conflict in the Star Trek series overseen by Gene Roddenberry is a good thing. Humanity got their shit together, made Earth a paradise, and went exploring the galaxy and other frontiers in life. Shoehorning conflict and darkness into the newer series destroys what made it unique.
Whenever I decide that something I bought a few months ago is perfect even after several washes and try to buy another one, that product turns out to be sold out, never to be seen again.