Zehaf-Bibeau carried a 30-30 Winchester shotgun and a knife during Wednesday's attack. Investigators believe he took the knife from his aunt's property in Mont Tremblant, where he lived years ago, Paulson said. The gun, which police described as "old and uncommon," also may have been hidden on the property, Paulson said.
Yeah, shotguns in .30-30 are pretty uncommon. As in, Do Not Exist.
I believe the word you're looking for there is "rifle."
Seems Elon Musk thinks AI is the greatest menace to humanity.
This leads to:
"So we need to be very careful with the artificial intelligence. There should be some regulatory oversight maybe at the national and international level, just to make sure that we don’t do something very foolish."
Which, on a moment's contemplation, makes no sense.
See, if I create a rogue AI, what can it do? Send spam e-mails? Start a blog? Surf the net for robot porn? Dream of electric sheep?
You really think some unregulated hacker is going to doom mankind?
But... if the government creates an AI... and puts it to use... that's an entirely different kettle of fish.
So perhaps the correct solution is to prevent any government involvement in AI?
The classic "let's put it in charge of the nuclear arsenal" scenario isn't the only plausible big-government use for an infallible computer intelligence that eliminates all possibility of human error. What else could be delegated to the Central Computer? Well, there's air traffic control. And economic planning. And Justice. Scared yet?
Meanwhile, consider what private industry might do. Self-driving cars! Wow! A rogue self-driving car could do as much damage as... as... as a rogue human with a regular car. Which is right up there with a rogue (minimally-trained) human with all the small arms he can carry.
Now, if a whole bunch of self-driving cars went rogue all at once, that'd be more troublesome. Like, a really big deal. But, unless there's a shared, calendar-based bug, or the cars are networked and conspiring against us...?
On the economic front: suppose we put an AI in charge of monetary policy? Think it could screw up any worse than the clowns we've got now? But, to inspire confidence, how about a well-defined, non-intelligent algorithm? It'd have to be a little more complex than "adjust the money supply such that the price of gold remains at $35 per troy ounce", but not necessarily a lot more. Ideally, it'd be simple enough for everyone to understand what drove the money supply. Perhaps elegance and transparency would prove more important, in the real world, than conceptual optimality?
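Just to make the point concrete: a fully transparent money-supply rule really can fit in a handful of lines. This is a sketch only — the gold-price target is the classic $35 peg mentioned above, but the proportional adjustment gain is a made-up illustrative number, not a policy recommendation:

```python
# Illustrative sketch of a simple, transparent money-supply rule.
# GOLD_TARGET is the classic $35/oz peg; GAIN is an invented number
# chosen purely for illustration.

GOLD_TARGET = 35.00   # target gold price, dollars per troy ounce
GAIN = 0.5            # fraction of the price error corrected each period

def adjust_money_supply(money_supply: float, gold_price: float) -> float:
    """Shrink the supply when gold trades above target (money too
    plentiful), grow it when gold trades below target."""
    error = (gold_price - GOLD_TARGET) / GOLD_TARGET
    return money_supply * (1.0 - GAIN * error)

# Gold at $38.50 (10% over target) cuts a supply of 1000 units by 5%:
print(adjust_money_supply(1000.0, 38.50))  # → 950.0
```

Anybody can read that and predict what the money supply will do next period — which is rather the point about elegance and transparency.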
How far our society has fallen! In these wicked days, word is making the rounds that police departments are warning of razor blades hidden in Halloween candy and/or apples!
Er, wait. Haven't those warnings been making the rounds since, like, the 1950s? Every year? And it always happened in Some Other City?
Why, yes! Snopes only traces it back to the mid-1960s (which is consistent with "as far back as I can remember"), though I'm pretty sure I read of it in some 1950s-vintage work of fiction.
And, indeed, as far as evil people actually distributing tampered goodies to the neighborhood kids: mostly mythical; the reported incidents seem to have been mainly hoaxes. And mainly inspired by the very warnings that were being tossed about.
The title of the post has nothing to do with Japanese-style fried eels.
I think I've mentioned a time or two that sometimes I'll have An Idea Whose Time Has Come, not have the time and resources to do anything with it, and then a couple of years later learn that someone at, e.g., IBM had the same idea at the same time.
(The cordless keyboard for the PCjr was one of these, though I was kinda-sorta ripping off Arthur C. Clarke if memory serves - taking a gimmick from a sci-fi novel and tweaking it just enough to be practical. Someone at IBM either read my mind or read the same novel, or maybe it was just an idea whose time had come.)
Anyway: there's a Thing that I've had on the drawing board for a couple of years now, even going through a couple of iterations of circuit design, but it's never quite seemed useful enough, nor producty-enough, for me to get boards fabbed, install parts on them, and get busy with firmware development. (This could change soon, as there's potentially an impending need for such a Thing.)
Well, at RTECC this morning, I had a chat with one of the exhibitors, and their new-real-soon product is, from a hardware perspective, a Thing very much like mine. Difference is, they've got a team of firmware developers, and a business plan that involves selling the hardware with a baseline firmware load, and then selling firmware upgrades for special additional capabilities.
During the chat, though, I did come up with a potentially interesting business model that basically involves charging an arm and a leg for the hardware and standard firmware (keeping everything closed-source), and then running a for-money on-line service whereby clients and their customers can sign in securely and enter function definitions to be compiled for the Thing (which is a sort of special-purpose test instrument, for a niche market, and typically used in acceptance testing).
Sorry, no specifics. Tinfoil time. Besides, I just came up with a further way of making the Thing look like it's an expensive piece of lab equipment, the better for Test Engineering to get it past Purchasing.
From an article about the Orion launch abort system*:
Should they need to abort, they'll be subjected to extreme forces. The launch abort motor provides 400,000 pounds of thrust, enough to accelerate the capsule from zero to 800 kilometers per hour in three seconds—and that's on top of whatever speed the launch vehicle is already going. The published numbers on how many Gs this produces varies, but it's somewhere north of 11.
Let's see, now.
Assuming constant acceleration, that's 800,000 m/hour, divided by 3600 s/hour, equals 222.22 m/s added by the LAS (once it disconnects from the booster, the booster no longer contributes any acceleration and can safely be left out of this particular calculation**). Divide by the 3 seconds it takes to reach that added speed: 74.07 m/s/s. Divide by 9.8 m/s/s: 7.56 G.
So where does the "somewhere north of 11" come from? Is the thrust wildly nonuniform? Is the "three seconds" actually much shorter? If the mass of the capsule were handily provided, dividing the thrust thereby ought to give a proper number for acceleration, but it's not mentioned in the article and I'm too lazy to go digging.
(Oh, if they're heading straight up during this process, add 1G, for a total of 8.56, obviously. Still well south of 11.)
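The arithmetic above is easy enough to check in a few lines. As a bonus, since the article gives the thrust but not the mass, you can back-solve for the capsule mass that would make "11 G" come out right — the implied mass below is derived for illustration, not a published figure:

```python
# Checking the back-of-the-envelope Orion LAS numbers from the post.
delta_v = 800_000 / 3600        # 800 km/h expressed in m/s ≈ 222.22
burn_time = 3.0                 # seconds, per the article
g0 = 9.8                        # m/s^2, as used above

accel = delta_v / burn_time     # ≈ 74.07 m/s^2
g_load = accel / g0             # ≈ 7.56 G
g_load_vertical = g_load + 1.0  # ≈ 8.56 G if headed straight up

# What mass WOULD produce 11 G from the quoted thrust?  (F = ma,
# so m = F / a.)  Purely an implied number, not from the article.
thrust_n = 400_000 * 4.448      # 400,000 lbf converted to newtons
mass_for_11g = thrust_n / (11 * g0)

print(round(g_load, 2), round(g_load_vertical, 2), round(mass_for_11g))
```

Either the "11 G" figure assumes a much lighter stack than the delta-v math implies, or the thrust profile is front-loaded well above the average — the two published numbers can't both describe a uniform three-second burn.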
* Which does not involve setting off a string of little nukes.
Ran into a young chap at the gas station who said he was in financial distress...
Brought to mind this post by Bayou Renaissance Man, and:
When I'd offer to buy the food/gas/clothes for them, and give them the merchandise rather than the money, in more than nine cases out of ten the offer was declined - in fact, more than a few of them actually insulted me, demanding cash instead. (Needless to say, they didn't get it.)
Well, this lad gave me the no-money-for-gas story... but (a) he was at a gas station, and (b) he was looking for someone to give money to the cashier so he could feed his car.
So I figured it really was gas he needed, and maybe he really was looking for work (he did come across as potentially employable). So I handed the cashier a few bucks for his pump, as well as the usual twenty for mine.
(And the car wasn't new and shiny. Looked like the sort of well-used vehicle a young job-seeker might be getting around in.)