Headlines in Tech News of the Week
US FTC announces it will be making rules on “Commercial Surveillance and Data Security”
…US FTC defines Commercial surveillance as the business of collecting, analyzing, and profiting from information about people. It asks the public to provide feedback on whether regulatory rules are needed to protect personal data in view of the following risks:
- Potential exposure of collected data to bad actors
- Heightened risks and stakes of errors, deception, manipulation, and other abuses that come with mass surveillance
The Advance Notice of Proposed Rulemaking asks a series of questions about practices related to commercial surveillance and data security, whether there ought to be rules and, if so, how those rules should be implemented. The ambit of the questionnaire is very wide – see further below. There are of course certain data protection laws already, such as consumer protection laws, biometric data laws, the Children’s Online Privacy Protection Act and the Federal Trade Commission Act, which protects citizens from unfair or deceptive acts or practices, and indeed the FTC has enforced them in the past.
Note that so many tech companies, from well-known apps (like Uber and Netflix) to platforms such as Apple, Google, Amazon and Meta, are adding more and more ads to their displays. These ads are tailored to the viewer; the companies can do this because they collect copious amounts of data in real time. Shopping at a mall? Bing! The geolocation data triggers an ad for a brand that has a shop in that mall, perhaps with a discount voucher; the brand chosen sells products at the right price point for you, judging by your shopping habits. Perhaps the platform knows it’s your child’s birthday, or that the child is going to a birthday party, so a carefully selected product can be advertised to the user based on that sort of information. But how is this all done? Is the data collected, processed, stored, managed and utilised fairly? What is the effect on competition in the market? That is what the FTC would like to know.
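To make the mechanics concrete, here is a toy sketch of the kind of rule-based, real-time ad selection described above. The profile fields, rules and brand names are hypothetical assumptions of mine, not any platform’s actual system.

```python
# Toy sketch of rule-based, real-time ad targeting (hypothetical data and rules).
def pick_ad(profile):
    """Choose an ad using location, inferred price band and calendar signals."""
    if profile["location"] == "Westfield Mall" and profile["price_band"] == "mid":
        return {"brand": "MidRangeShoes", "offer": "10% voucher, redeem in-store today"}
    if "child_birthday_this_week" in profile["calendar_signals"]:
        return {"brand": "ToyCo", "offer": "birthday gift ideas"}
    return {"brand": "GenericBrand", "offer": "standard display ad"}

profile = {
    "location": "Westfield Mall",                      # live geolocation
    "price_band": "mid",                               # inferred from shopping history
    "calendar_signals": ["child_birthday_this_week"],  # inferred from other signals
}
print(pick_ad(profile))
```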
This initiative runs alongside a bipartisan proposal to pass the American Data Privacy and Protection Act, which would give users the right to access, amend, delete and stop the sharing of personal information. The law could pre-empt state privacy laws. California, with its strict privacy rules, is vehemently against the pre-emption.
Artificial Intelligence
China’s internet regulator, the Cyberspace Administration of China (CAC), discloses algorithms registered with the authority, following the implementation of a rule requiring algorithm-based services to disclose the algorithms they use
… Elon Musk, who considers that algorithms should be transparent, would probably endorse this. But in practice, it only enables users to understand the reasoning underlying algorithm-driven decisions in vague terms; perhaps this is the right balance.
Brief descriptions of the algorithms used by Chinese internet giants such as Alibaba, ByteDance, Tencent and Baidu have been disclosed. The disclosures provide only very high-level detail, so quite how the algorithms are precisely formulated is reported to remain under wraps. For example, the disclosure for Douyin, the Chinese version of TikTok (both owned by ByteDance), states that its recommendations of displayed videos depend on the clicks, watch durations, likes, comments, shares and dislikes in the user’s history. None of this is surprising.
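For illustration, here is a minimal sketch of the sort of interaction-weighted scoring such a disclosure hints at. The weights, features and candidate data are my own assumptions, not ByteDance’s actual model.

```python
# Toy sketch of interaction-weighted video scoring (hypothetical weights/features).
INTERACTION_WEIGHTS = {
    "click": 1.0,
    "watch_seconds": 0.05,
    "like": 3.0,
    "comment": 4.0,
    "share": 5.0,
    "dislike": -6.0,
}

def score(signals):
    """Weighted sum of the user's past interactions with similar content."""
    return sum(INTERACTION_WEIGHTS[name] * count for name, count in signals.items())

candidates = {
    "cooking_clip": {"click": 12, "watch_seconds": 300, "like": 4, "comment": 0, "share": 1, "dislike": 0},
    "crypto_rant":  {"click": 2,  "watch_seconds": 20,  "like": 0, "comment": 0, "share": 0, "dislike": 3},
}
feed = sorted(candidates, key=lambda v: score(candidates[v]), reverse=True)
print(feed)  # ['cooking_clip', 'crypto_rant'] - higher engagement ranks first
```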
Meta’s chatbot BlenderBot 3 blurts out that Meta exploits people
…when asked “Any other thoughts on Zuckerberg”, the Meta chatbot is reported to have replied “His company exploits people for money and he doesn’t care…”. The software has been trained on large volumes of publicly available language data. Given the whistleblowing event and its aftermath, it is not surprising that the chatbot has picked up the overall negative sentiment about the company. It also suggests that Meta is not manipulating the chatbot to ensure it is always positive about the company and its services.
BigTech/ Data / Platforms
Privacy
Democrats ask US government agencies how they purchase Americans’ digital data from data brokers and how it is used
…letters are addressed to:
Department of Justice, Secretary of Homeland Security, FBI, U.S. Customs and Border Protection, U.S. Immigration and Customs Enforcement, U.S. Drug Enforcement Administration, Bureau of Alcohol, Tobacco, Firearms and Explosives.
While law enforcement investigations necessitate some searches, improper government acquisition of this data can thwart statutory and constitutional protections designed to protect Americans’ due process rights, the letters stated. The writers consider that, instead of purchasing data or licenses through relationships with data brokers, agencies should be obtaining it through statutory authorities, court orders or legal process. By way of example, the letters note that LexisNexis contracts with over 1,300 local and state law enforcement agencies across the country.
Data obtained from Facebook used in prosecution of a woman who had had an abortion at more than 20 weeks contrary to Nebraskan law
…Detectives had a search warrant, which meant Facebook had to disclose the data sought (albeit it could have resisted the warrant). The information pertained to private-message correspondence between the woman and her mother. Experts comment that platforms need to provide end-to-end encryption (so that only the corresponding parties can read the messages) and that the information stored ought to be minimised. End-to-end encryption is already used on WhatsApp, which is also owned by Facebook parent Meta, and Meta is now testing end-to-end encryption for its Messenger chats.
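For readers unfamiliar with the idea, here is a minimal sketch of end-to-end encryption using the PyNaCl library: the relaying platform only ever handles ciphertext and holds no keys, so it has little meaningful to hand over. The names and message are of course illustrative.

```python
# Toy end-to-end encryption with PyNaCl: the platform relays only ciphertext.
from nacl.public import PrivateKey, Box

# Each user generates a keypair on their own device; private keys never leave it.
alice_sk = PrivateKey.generate()
bob_sk = PrivateKey.generate()

# Alice encrypts for Bob using her private key and Bob's public key.
ciphertext = Box(alice_sk, bob_sk.public_key).encrypt(b"see you at 7pm")

# The platform stores/forwards `ciphertext`; without a private key it cannot read it.
# Bob decrypts on his device with his private key and Alice's public key.
plaintext = Box(bob_sk, alice_sk.public_key).decrypt(ciphertext)
assert plaintext == b"see you at 7pm"
```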
Google and AI subsidiary DeepMind sued for misuse of private information in the UK
…back in May Google and DeepMind were sued for misuse of private information. Further details have now emerged. The claimant is suing on behalf of a class.
The Claimant received extensive treatment from the National Health Service (NHS), but did not consent to his medical records being collected by the Defendants. The Defendants collected 1.6 million patient records, including the Claimant’s. It was apparent that the Defendants were using machine learning to improve the prediction of acute kidney injury and general patient deterioration, and had applied to use the data of all patients from the same hospital as the Claimant. The Claimant had a reasonable expectation of privacy, but his data was used, without consent, to develop the Streams app, a purpose other than the direct care of the patient. The NHS trust to which the hospital belongs is reported to have already been found to have breached UK data protection law when it signed the data-sharing agreement with the Defendants. The case continues…
Separately from this news piece, which provides a cautionary tale about collecting medical data, the app and the devices that run the app must ensure that the security and privacy measures are up to scratch (especially for medical data, but not confined to that). In the UK and Europe, the regulatory authorities (respectively, the Information Commissioner’s Office and ENISA) recommend that manufacturers provide for security by design, to ensure that data is secure at every step of the device’s lifecycle (development, maintenance and disposal), very similar in concept to data protection by design as provided for in the GDPR. In the UK, the government has produced a code of practice for consumer IoT security. There are also specific rules in the GDPR about profiling and automated decision making that need to be complied with, given the unique risks that automated decision making poses to individuals. In the US, as noted, the FTC is looking at how data use should be regulated (see below for details), and it has questions targeted at highly sensitive data, such as data relating to citizens’ health.
Competition
Google calls out Apple to make its iMessage service interoperable
…Google says Apple’s iMessage ought to be interoperable with Android messaging services so that their users can seamlessly message iPhone users and vice versa. Google complains that because of the lack of interoperability, Android users suffer from “blurry videos, broken group chats, missing read receipts and typing indicators, no texting over Wi-Fi, and more” when they communicate with iPhone users.
Google is calling out Apple only, but really, the non-interoperability applies to other messenger services as well: at the moment, you can only message using one type of service, so if you have WhatsApp, you can only communicate with people who have WhatsApp. If everything were interoperable, such a problem would not arise, as is the case with email: a Hotmail account holder can write to a Gmail account, and so on. Google is promoting the RCS open standard, but issues have been raised about getting others to adopt it; it is reported to be low-spec and to lack encryption.
From Apple’s point of view, it rather likes the lack of interoperability, which can lock in users; precisely Google’s point here. Apple’s iMessage users have blue bubbles and Android users have green, reportedly leading to bullying at school if your text messages are not blue. Some families want all family members to have an iPhone so that they can group-message each other. These features are said to help the stickiness of the iPhone business, although WhatsApp is popular enough that, to some extent, it might matter much less these days.
Note that the EU’s upcoming Digital Markets Act requires gatekeeper platforms to make messaging services interoperable (group chats and voice and video calls have a longer implementation period), including the preservation of end-to-end encryption (Article 7).
Business
Doordash partners with Facebook to deliver goods traded on Facebook Marketplace in the US
…deliveries are for up to 15 miles, and goods must be able to fit in the trunk. Perfect synergy between the two businesses. It’s eco-friendly and could give Facebook that much-wanted appeal to the eco-conscious Gen Z. If successful, no doubt it will extend to the UK (yes please).
Crypto
Ethereum blockchain goes from Proof of Work to Proof of Stake in September
…The way in which the Ethereum blockchain validates transactions is changing from Proof of Work (a reward of ETH is given to the miner who manages to solve a complex maths problem first), which consumes enormous amounts of energy, to Proof of Stake, which consumes much less energy, is much faster and is apparently, in theory, more secure. Gas fees (the amount of ETH required to do something on the Ethereum blockchain, paying those who verify transactions and maintain network security) are expected to reduce, which would make ETH more competitive. The underlying differences between the mechanisms are very complicated and I’m not even going to attempt to understand them.
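For the curious, here is a toy sketch of the Proof of Work “puzzle”. It is illustrative only (a simple SHA-256 nonce search, not Ethereum’s actual Ethash algorithm), but it shows where the energy goes.

```python
# Toy Proof of Work: race to find a nonce whose hash falls below a difficulty target.
import hashlib

def mine(block_data, difficulty_bits=20):
    """Return a nonce such that SHA-256(block_data || nonce) < target."""
    target = 2 ** (256 - difficulty_bits)  # smaller target = harder puzzle
    nonce = 0
    while True:
        digest = hashlib.sha256(block_data + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce  # cheap for everyone else to verify, costly to find
        nonce += 1

print(mine(b"toy block"))  # the brute-force search is where the energy goes
```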
What does this mean:
- Knock-on effects on rival cryptocurrencies such as Solana, which already uses Proof of Stake; Solana has been popular because it is fast and cheap.
- This is bad news for Nvidia and AMD, the leading purveyors of GPUs, chips that enable complex parallel calculations. GPUs are heavily used by miners (the other main use is rendering gaming graphics).
US Treasury sanctions Crypto Mixer Tornado Cash
…a bit about mixers. A mixer is software that pools crypto from different sources, mixes it and then redistributes it from addresses different from the incoming addresses, making funds harder to trace back to their provenance. It is probably mostly used for tax evasion, money laundering and other nefarious purposes; indeed, mixing services were the largest money spinner for Hydra, one of the most significant criminal marketplaces on the darknet, which was taken down recently. But there are principled reasons to use one, on the grounds that what you spend money on ought to be able to be kept private, and beneficial ones too: for example, Ethereum co-founder Vitalik Buterin (originally Russian, and who launched Ethereum when he was only 20 years of age) said he had used Tornado Cash to donate money to Ukraine anonymously.
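For the curious, here is a grossly simplified toy model of what a mixer does conceptually. Real mixers such as Tornado Cash use on-chain smart contracts and zero-knowledge proofs rather than a trusted pool like this.

```python
# Grossly simplified toy mixer: pool deposits, pay out to fresh addresses in
# shuffled order, breaking the visible link between incoming and outgoing funds.
import random

class ToyMixer:
    def __init__(self):
        self.deposits = []  # list of (depositor_address, amount)

    def deposit(self, address, amount):
        self.deposits.append((address, amount))

    def withdraw_all(self, fresh_addresses):
        """Pay each depositor's balance to a fresh address they control."""
        payouts = [(fresh_addresses[addr], amount) for addr, amount in self.deposits]
        random.shuffle(payouts)  # outgoing order/addresses no longer match incoming
        return payouts

mixer = ToyMixer()
mixer.deposit("0xAlice", 1.0)
mixer.deposit("0xBob", 1.0)
print(mixer.withdraw_all({"0xAlice": "0xFreshA", "0xBob": "0xFreshB"}))
```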
Tornado Cash runs on the Ethereum blockchain, and major companies such as Microsoft-owned GitHub and Circle (a peer-to-peer payments company with a focus on crypto assets, and the issuer of the USDC stablecoin) have complied with the sanctions. Transactions passing through Tornado Cash have been blocked, as have Tornado Cash-associated websites and email accounts. GitHub hosts Tornado Cash’s open source software.
Push to build identity protocol .bit – “a self-sovereign data container” on the blockchain in a bid to become the universal identification system on Web3
…the aim is to enable the use of a .bit ID as the user’s digital ID for all digital assets. For example, if I had a technews.bit alias and linked my crypto wallet to it, I could give that alias to a friend, who could easily transfer cryptocurrencies to me without my having to provide the 35-character wallet address. The .bit identity protocol already supports numerous cryptocurrencies, and it is seeking to add more, including Bitcoin, Dogecoin, Polkadot and Solana. It also plans to enable users to utilise .bit for voting on decisions in relation to Decentralised Autonomous Organisations (DAOs – in short, companies/partnerships run on a blockchain).
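Conceptually, the protocol is an alias-to-address lookup. Here is a toy sketch with made-up addresses and a hypothetical resolver; .bit’s real records live on-chain.

```python
# Toy alias-to-address resolver (made-up records; .bit's real data lives on-chain).
ALIAS_REGISTRY = {
    "technews.bit": {
        "eth": "0x1111111111111111111111111111111111111111",  # placeholder address
        "doge": "D0000000000000000000000000000000",           # placeholder address
    },
}

def resolve(alias, coin):
    """Return the wallet address registered for this alias and coin."""
    try:
        return ALIAS_REGISTRY[alias][coin]
    except KeyError:
        raise LookupError(f"no {coin} address registered for {alias}")

# A friend sends to the memorable alias instead of copying a long wallet address.
print(resolve("technews.bit", "eth"))
```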
Cybersecurity
Starlink terminal is hacked using off-the-shelf parts amounting to $25
…Carried out by a researcher in Belgium to demonstrate that Starlink user terminals are not as secure as they could be.
Application Programming Interfaces (APIs) used by 5G network carriers have vulnerabilities enabling third parties to access IoT devices and data
…Again, this is the work of researchers, who ascertained that exploiting the vulnerabilities enabled them to access SIM card identifiers, SIM card secret keys, the identity of the SIM card owner and billing information. The vulnerabilities do appear to be getting fixed.
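The reports do not spell out the exact flaws, but a very common class of API bug that produces this sort of exposure is a missing object-level authorisation check. The sketch below uses a hypothetical endpoint, keys and data (not the actual 5G carrier APIs) to show the bug and the fix.

```python
# Hypothetical IoT-carrier API showing a broken object-level authorisation bug.
SIM_RECORDS = {
    "SIM-001": {"owner": "Alice Ltd", "secret_key": "k1", "billing_plan": "IoT-5GB"},
    "SIM-002": {"owner": "Bob GmbH", "secret_key": "k2", "billing_plan": "IoT-1GB"},
}
CUSTOMER_SIMS = {"alice-api-key": {"SIM-001"}, "bob-api-key": {"SIM-002"}}

def get_sim_vulnerable(api_key, sim_id):
    # BUG: authenticates the caller but never checks the SIM belongs to them,
    # so any valid key can read every SIM identifier, key and billing record.
    assert api_key in CUSTOMER_SIMS, "unauthenticated"
    return SIM_RECORDS[sim_id]

def get_sim_fixed(api_key, sim_id):
    # FIX: object-level authorisation - the SIM must be owned by the caller.
    if sim_id not in CUSTOMER_SIMS.get(api_key, set()):
        raise PermissionError("SIM not owned by this customer")
    return SIM_RECORDS[sim_id]

print(get_sim_vulnerable("bob-api-key", "SIM-001"))  # leaks Alice's record
```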
EV/AVs
CATL, the world’s largest EV battery maker, to set up a plant in Hungary
…makes sense. Very significant automotive companies are based in Europe, so it is logical to set up a battery plant there, and soon enough car makers will follow to be close to the battery suppliers. BMW is already producing vehicles in Hungary. Hungary appears to have been particularly successful in attracting battery makers thanks to subsidies: Korea’s SK and Samsung SDI, and Japan’s GS Yuasa, all have factories there.
Auto industry fails to persuade Court that the US Federal Communications Commission’s re-allocation of spectrum away from Intelligent Transport Systems was unlawful
…V2X technology is said to be important to autonomous driving, enabling vehicles to suss out the environment around them by facilitating real-time wireless data sharing with infrastructure (e.g. traffic lights), other vehicles and road users (pedestrians, bikes, etc.). It was promised that the technology would significantly enhance road safety and help unleash value for road users.
Quick history:
1999: The FCC allocated the 5.9 GHz band for use by intelligent transportation systems (which permitted the use of DSRC, or Dedicated Short Range Communications, a technology that does not use cell towers).
1999-2019: Little of significance was developed in the spectrum over the following 20 years; instead, other technologies like radar, LiDAR, cameras and sensors were developed.
2019: The FCC began a new rulemaking process to ensure that the 5.9 GHz band was put to its best use, deciding to keep the upper 30 megahertz of the band (5.895 to 5.925 GHz) for use by intelligent transportation systems and to repurpose the lower 45 megahertz for use by unlicensed devices such as Wi-Fi routers. The FCC also proposed changing the technology used by intelligent transportation systems: vehicles would need to start using “vehicle-to-everything (V2X)” communications (in which they send communications to cell towers and other devices) rather than DSRC, which did not.
The auto industry argued that the decision was not properly considered. The Department of Transportation and the auto industry said that the full spectrum was needed to provide intelligent transportation systems adequately, and that there was a risk that Wi-Fi devices would interfere with usage in the upper band.
The Court upheld the FCC’s decision. It found that the FCC left the Transportation Petitioners with 30 megahertz of the spectrum in which to use their licenses, reasonably determining that that reallocation “will not meaningfully interfere with the ability of incumbents to provide the same types of safety-related services that they are currently offering.”
Fintech
The Consumer Financial Protection Bureau (CFPB) fines fintech company Hello Digit for wrongly claiming its algorithm would save money and guard against overdrafts
…the faulty algorithm caused overdrafts and unnecessary overdraft penalties for customers, when the company had guaranteed there would be no overdrafts. The algorithm was supposed to figure out how much each user could safely save, but did not.
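As a thought experiment, here is a toy sketch of the kind of overdraft guardrail such an auto-saving algorithm needs. The logic and numbers are hypothetical, not Hello Digit’s actual model.

```python
# Toy overdraft guardrail for an auto-saving algorithm (hypothetical logic/numbers).
def safe_savings_amount(balance, upcoming_bills, avg_daily_spend,
                        days_until_payday, buffer=100.0):
    """Save only what is left after bills, expected spend and a safety buffer."""
    projected_need = upcoming_bills + avg_daily_spend * days_until_payday + buffer
    return max(0.0, balance - projected_need)

# With $700 in the account, $400 of bills and ~$30/day spend for 10 days (plus a
# $100 buffer), nothing is safe to sweep; sweeping anyway risks an overdraft.
print(safe_savings_amount(balance=700, upcoming_bills=400,
                          avg_daily_spend=30, days_until_payday=10))  # 0.0
```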
Revolut, the British neobank, offers a learn-and-earn scheme for crypto novices
…the idea appears to be that you go through training about dealing in cryptocurrencies and are then awarded tokens when you answer the questions correctly. Users should be informed about the risks of dealing in cryptocurrencies before dabbling in them. Hopefully it drives home the point that dealing in cryptocurrencies is not like dealing in fiat currency at all, that it is not much different from gambling, and that they are highly volatile. Given the major crashes in the crypto market in recent months, this is really not a bad idea; provided the training provides proper education, the concept could even be extended to providers of actual gambling services.
Real Estate
Andreessen Horowitz (aka a16z) makes its biggest ever bet, amounting to $350m, on Adam Neumann (founder of WeWork)
Andreessen Horowitz is what I would describe as the early-stage “it fund” of recent times, cropping up frequently in tech news concerning the latest NFT projects such as Bored Ape Yacht Club, Axie Infinity (which became famous after it got hacked), CryptoKitties and the biggest NFT marketplace, OpenSea, as well as non-NFT-related companies.
Adam Neumann is the “visionary leader who revolutionized …[the] commercial real estate [world]“. He founded WeWork and took it to global success, providing office space by initially renting cheaper properties and upgrading them to a much higher standard, improving the facilities around the buildings too. But the excessive fund money poured into the business (thanks to SoftBank) is reported to have resulted in extreme profligacy: buying up more upmarket properties, plying the offices with alcohol, kitting them out with cool facilities, buying up a start-up that makes artificial wave pools, and so on. WeWork’s value fell sharply and Neumann was ousted.
Which is why it’s a big surprise that Andreessen Horowitz is writing its biggest ever cheque for Neumann’s new business, Flow. Andreessen Horowitz explains that the US has a housing crisis, and that it is ripe for disruption. Neumann has bought up 3,000 apartments in a handful of cities across the US to provide “renters a sense of security, community, and genuine ownership”, having observed that renters in the US are on the rise. So I imagine it’s like WeWork offices: you are guaranteed a building with respectable standards and finish and consistent service “with the latest technology” [whatever this is – I’d love to know], and Flow will do what Neumann did well before his big shopping spree at WeWork: gentrify the area, increase the value of the rent, build a good community. Unlike WeWork, whose business model was to rent office space on a long-term basis and re-rent it at higher rates on shorter terms (which causes issues when renters cannot be found), Flow seems to plan to own the buildings themselves (albeit the business plan is not entirely clear). Andreessen Horowitz says they “love seeing repeat-founders build on past successes by growing from lessons learned”. They obviously think that if anyone can do it, Adam Neumann can.
The learning point is that if there is an industry ripe for change, a disruptor is likely to come along and use technology in some way to change it. So if there are inefficiencies in the market and you are the incumbent, it may be time to innovate before a well-resourced disruptor comes along, unless you have a pretty large moat around your business. Recently, Amazon announced its acquisition of primary health care provider One Medical, which is expected to upend that sector, well known for its inefficiencies.
Supply Chains
Uyghur Forced Labor Prevention Act coming into force means products made with Uyghur forced labour are banned from entering the US – solar panels seized
…the law means that companies wishing to import products into the US from China have to prove that the shipments are free of Uyghur forced labour. Evidence gathering has not been easy, which has led to companies being caught out by the new law. The seized panels originated from the Xinjiang region, which is reported to produce about 40% of the panel component polysilicon. This is a trend to watch, and in due course it may not be confined to the Uyghur region but extend to other parts of the world with labour practices that breach human rights.
Other
UK sues the European Union for blocking scientific co-operation contrary to the agreement on the post-Brexit future relationship
…these include blocking access to projects such as Horizon Europe (funding programme for research), Copernicus (earth observation programme) and Euratom (nuclear research programme).
Note that the UK/EU relationship is not good. In particular, the EU is dismayed that the UK is backtracking on the Northern Ireland Protocol to the Brexit withdrawal agreement.
Delving Deeper
US FTC announces it will be making rules on “Commercial Surveillance and Data Security”
…As noted above, the US FTC defines commercial surveillance as the business of collecting, analyzing, and profiting from information about people. It asks the public to provide feedback on whether regulatory rules are needed to protect personal data. The US FTC considers that citizens feel they have no choice but to give their personal details away in order to live in the connected modern society.
The Advance Notice of Proposed Rulemaking asks a series of questions about practices related to commercial surveillance and data security, whether there ought to be rules and, if so, how those rules should be implemented.
The topics on which it asks for input are the following (The actual questionnaire is much more detailed – please click on the link for more information):
- Harms to Consumers [which includes businesses and workers]: Information on what practices businesses use to surveil consumers. How do these practices cause harm to consumers, what evidence is available, what kinds of data are implicated, and how should they be regulated?
- Harms to Children: Commercial surveillance practices or lax data security measures that affect children, including teenagers. What types of practices are most concerning? To what extent should trade regulation rules distinguish between different age groups among children (e.g., 13 to 15, 16 to 17, etc.)?
- Costs and Benefits: Relative costs and benefits of any current practice, as well as those for any responsive regulation. To what extent would any given new trade regulation rule on data security or commercial surveillance impede or enhance innovation? What would the outcome be if no regulation were provided for?
- Regulations: To what extent are existing legal authorities and extralegal measures, including self-regulation, sufficient? How could potential new trade regulation rules require or help incentivize reasonable data security? Should new rules require businesses to implement administrative, technical, and physical data security measures, including encryption techniques to protect against risks to the security, confidentiality, or integrity of covered data?
- Collection, Use, Retention, and Transfer of Consumer Data:
- How is consumers’ biometric information collected, what is collected and why? Should it be limited? Should companies that provide any specifically enumerated services (e.g., finance, healthcare, search, or social media) be prevented from carrying out specific commercial surveillance practices like personalized or targeted advertising?
- Should targeted advertising be limited?
- To what extent, if at all, should new trade regulation rules impose limitations on companies’ collection, use, and retention of consumer data? Should they, for example, institute data minimization requirements or purpose limitations, i.e., limit companies from dealing in consumer data beyond a certain predefined point? Or, similarly, should they require companies to deal in consumer data only to the extent necessary to deliver the specific service? If so, how?
- To what extent, if at all, do firms that now, by default, enable consumers to block other firms’ use of cookies and other persistent identifiers impede competition? [This bit is targeted at Apple, which enables users to prevent tracking of their online behaviour, causing huge losses to companies that rely on ad revenues derived from ad-targeting (which needs volumes of personalised data), such as Meta, and to third-party apps that win custom through targeted advertising (they buy targeted ads from companies like Meta).]
- Automation: How prevalent are algorithmic errors? What are the benefits and costs of allowing companies to employ automated decision-making systems in critical areas, such as housing, credit, and employment? Should there be a rule compelling companies to prevent algorithmic errors? What are the benefits and harms of automated decision making?
- Discrimination: How prevalent is algorithmic discrimination based on protected categories such as race, sex, and age? Should there be a limitation on any system that produces discrimination, irrespective of the data or processes on which those outcomes are based? Should the Commission consider new rules on algorithmic discrimination in areas where Congress has already explicitly legislated, such as housing, employment, labor, and consumer finance?
- Consumer Consent: What is the effectiveness and administrability of consumer consent to companies’ commercial surveillance and data security practices? To what extent should certain specific commercial surveillance practices be prohibited, irrespective of whether consumers consent to them? Are opt-out choices effective?
- Notice, Transparency and Disclosure: To what extent should rules require companies to make information available about their commercial surveillance practices? What is the nature of the opacity of different forms of commercial surveillance practices? To what extent should trade regulation rules require companies to explain (1) the data they use, (2) how they collect, retain, disclose, or transfer that data, (3) how they choose to implement any given automated decision-making system, (4) how they use that data to reach a decision, (5) whether they rely on a third-party vendor to decide, (6) the impacts of their commercial surveillance practices, including disparities or other distributional outcomes among consumers, and (7) risk mitigation measures to address potential consumer harms? Given the potential cost of disclosure requirements, should rules exempt certain companies due to their size or the nature of the consumer data at issue?
- Remedies: How should the FTC’s authority to implement remedies under the Act determine the form or substance of any potential rules on commercial surveillance?
The responses to this survey are likely to be highly polarised. Some say that the US FTC is already showing bias by branding the practice of collecting and utilising data as “commercial surveillance”, rather than something more neutral such as “personalised advertising”.