In "Teaching Academic Honesty in CS50," a new paper scheduled to be presented at SIGCSE 2020, the educators behind Harvard's fabled CS50 introductory computer science course report on an unexpected benefit of introducing a "regret clause" that gives students who cheat 72 hours to self-report their misdeeds in return for a lesser punishment: It's made the CS50 team lose less sleep over referring "unregretful" cheaters for harsh punishment. From the SIGCSE paper: "Invocations of that [regret] clause have led to heart-to-heart talks, referrals for mental health, and, ultimately, teachable moments for an otherwise not-previously-reached demographic. But that same clause has also contributed to an uptick in the number of cases referred to the university's honor council for disciplinary action [which may result in required withdrawal from the university], in part because we now feel more comfortable referring cases after students have had an opportunity to take ownership themselves but have chosen not to do so." Bet you didn't see that one coming, kids!
A leading research centre has called for new laws to restrict the use of emotion-detecting tech. The AI Now Institute says the field is "built on markedly shaky foundations." Despite this, systems are on sale to help vet job seekers, test criminal suspects for signs of deception, and set insurance prices. It wants such software to be banned from use in important decisions that affect people's lives and/or determine their access to opportunities. The US-based body has found support in the UK from the founder of a company developing its own emotional-response technologies, who cautioned that any restrictions would need to be nuanced enough not to hamper all work being done in the area.
AI Now refers to the technology by its formal name, affect recognition, in its annual report. It says the sector is undergoing a period of significant growth and could already be worth as much as $20 billion. "It claims to read, if you will, our inner-emotional states by interpreting the micro-expressions on our face, the tone of our voice or even the way that we walk," explained co-founder Prof Kate Crawford. "It's being used everywhere, from how do you hire the perfect employee through to assessing patient pain, through to tracking which students seem to be paying attention in class," she added. "At the same time as these technologies are being rolled out, large numbers of studies are showing that there is... no substantial evidence that people have this consistent relationship between the emotion that you are feeling and the way that your face looks."
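To make the term concrete: affect-recognition systems of the kind described above ultimately reduce to a classifier that maps measured facial or vocal features to a discrete emotion label. The following is a deliberately minimal, purely hypothetical sketch of that idea in Python; the feature names, prototype values, and labels are invented for illustration and describe no real product, and the report's point is precisely that the mapping from such features to felt emotion lacks solid evidence.

```python
# Illustrative only: a toy "affect recognition" step that maps facial
# action-unit intensities (brow raise, lip-corner pull, jaw drop) to an
# emotion label with a nearest-centroid rule. Every feature, prototype
# value, and label here is invented for this sketch and describes no
# real product.
from math import dist

# Hypothetical per-emotion "prototype" feature vectors, each scaled 0..1:
# (brow_raise, lip_corner_pull, jaw_drop)
CENTROIDS = {
    "happy":     (0.2, 0.9, 0.3),
    "surprised": (0.9, 0.3, 0.8),
    "neutral":   (0.1, 0.1, 0.1),
}

def classify_affect(features):
    """Return the label whose prototype is closest to the observed features."""
    return min(CENTROIDS, key=lambda label: dist(features, CENTROIDS[label]))

if __name__ == "__main__":
    observed = (0.15, 0.85, 0.25)     # made-up measurements from a single video frame
    print(classify_affect(observed))  # -> "happy" (nearest prototype by distance)
```

Real products wrap far more elaborate models around the same basic move, which is why AI Now's criticism targets the premise that such features track inner emotion at all, rather than any particular implementation.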
A Viasat spokesperson wouldn't comment on what prices and data caps will be applied to the company's FCC-subsidized plans. Viasat said it will provide the required 25Mbps service "along with an evolving usage allowance, and at FCC-defined prices, to certain areas, where we will be subject to a new range of federal and state regulations."
Apple Pulls App That Let You Turn Your Phone Into a Virtual iPod With Click Wheel (apple.com)
The Rewound blog says that Apple pulled the app because it copied the iPod's design, charged for Apple Music features, and because people could mistake the app for an Apple product. The blog makes the case that the app had a pretty basic interface that looked nothing like an Apple app, and that the iPod classic skins didn't come preinstalled. (You had to download them after you had already installed the app.) We've asked Apple for comment, and we'll update this story when we hear back. The blog also says the iOS app can't be updated without "breaking the app for all 170,000+ users," but the developer, Louis Anslow, says he will attempt to bring the app back in some way. On a GoFundMe page for continued development of Rewound, Anslow says he will "try some tweaks to get Rewound resubmitted" on the App Store and that the GoFundMe will help support development of a web app and an Android app.
Microsoft said in a statement: "The ads within the app itself will be displayed regardless of which email address you use it with. It is not removable, but you can submit it as a suggestion within the Feedback Hub on Windows 10 here: https://msft.it/6012TVPXG."
How a Whale Crashed Bitcoin To Sub-$7,000 Overnight (newsbtc.com)
Bitcoin lost billions of dollars' worth of valuation within a 30-minute timeframe as a Chinese cryptocurrency scam operation allegedly liquidated its stolen funds via over-the-counter markets. The initial sell-off by PlusToken triggered a domino effect of mass liquidations. PlusToken, a fraud scheme that duped investors out of more than $2 billion, dumped huge bitcoin stockpiles from its anonymous accounts, according to Chainalysis. The New York-based blockchain consultancy cited an internal investigation showing the PlusToken scammers on a systematic crypto liquidation spree. Some of them have been actively selling bitcoin since June -- right after the cryptocurrency established a year-to-date high of circa $14,000.
According to Chainalysis, PlusToken had cashed out at least $185 million worth of bitcoin via OTC desks. "We can say that those cashouts increased volatility in Bitcoin's price and that they correlate significantly with Bitcoin price drops," says Chainalysis. "Chainalysis's study shows that the entity still holds a massive stash of bitcoin that it might liquidate at a later stage," adds NewsBTC. "That raises the prospects of more price crashes unless there is an adequate demand to match the scammer's supply flow."
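As a rough sketch of the kind of relationship Chainalysis describes, assuming one already had a daily series of cashout volumes attributed to the PlusToken wallets and matching bitcoin closing prices (both invented below), the "cashouts correlate with price drops" claim amounts to checking the correlation between cashout volume and the next day's return. This is a toy Python illustration, not Chainalysis's data or methodology.

```python
# Toy illustration of "cashouts correlate with Bitcoin price drops":
# Pearson correlation between a made-up daily cashout series and the
# next day's price return. Neither series is real data, and this is
# not Chainalysis's methodology.
from statistics import mean, pstdev

cashouts_usd = [5e6, 20e6, 2e6, 35e6, 1e6, 50e6, 4e6]        # hypothetical daily OTC cashouts
btc_close = [8200, 8100, 7800, 7900, 7300, 7350, 6800, 6900]  # hypothetical daily closing prices

# Next-day simple return aligned with each cashout day.
returns = [(btc_close[i + 1] - btc_close[i]) / btc_close[i] for i in range(len(cashouts_usd))]

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = mean(xs), mean(ys)
    cov = mean((x - mx) * (y - my) for x, y in zip(xs, ys))
    return cov / (pstdev(xs) * pstdev(ys))

print(f"correlation(cashouts, next-day return) = {pearson(cashouts_usd, returns):+.2f}")
```

A strongly negative coefficient on real data would be consistent with the claim, though correlation alone would not establish that the cashouts caused the drops.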