#Engineering #students have selected @elonmusk’s SpaceX and Tesla as the most attractive #employers in the #UnitedStates, according to @UniversumGlobal
Also, USEPA (18th) and DOE (23rd) are ranked in the top 30. Energy and the environment are topics they wish to work on.
@SpaceX @Tesla @Google @Boeing @NASA @LockheedMartin @Apple @Microsoft @amazon @exxonmobil
Training artificial intelligence is an energy-intensive process. New estimates suggest that the carbon footprint of training a single AI can be as much as 284 tonnes of carbon dioxide equivalent – five times the lifetime emissions of an average car.
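The car comparison can be sanity-checked with simple arithmetic. The sketch below uses only the two figures stated in the sentence above (284 tonnes CO2e and the factor of five); the implied per-car lifetime figure is derived, not taken from the article.

```python
# Back-of-the-envelope check of the comparison in the text (illustrative only).
TRAINING_FOOTPRINT_T = 284  # tonnes CO2e for one training run, as cited above
CAR_MULTIPLE = 5            # "five times the lifetime emissions of an average car"

# Implied lifetime emissions of one average car, in tonnes CO2e
car_lifetime_t = TRAINING_FOOTPRINT_T / CAR_MULTIPLE
print(f"Implied average-car lifetime emissions: {car_lifetime_t:.1f} tonnes CO2e")
```

Running this prints an implied figure of 56.8 tonnes CO2e per car lifetime, consistent with the ratio stated in the article.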
Emma Strubell at the University of Massachusetts Amherst in the US and her colleagues have assessed the energy consumption required to train four large neural networks, a type of AI used for processing language.
Language-processing AIs underpin the algorithms that power Google Translate as well as OpenAI’s GPT-2 text generator, which can convincingly pen fake news articles when given a few lines of text.
New Scientist, issue 3234, published 15 June 2019