828cloud

Data, Info and News of Life and Economy

Daily Archives: May 2, 2023

Humour

Mapped: The State of Democracy Around the World


Source : Visual Capitalist

Is Saudi Arabia Selling Oil to China for Gold?

Jan Nieuwenhuijs wrote . . . . . . . . .

Rumors are making the rounds that Saudi Arabia is selling oil for yuan, which it then converts into gold on the Shanghai International Gold Exchange (SGEI). Such a development would make sense: large parts of the world want to de-dollarize, but the renminbi is not suitable for use as a reserve currency, because China has a closed capital account and a weak rule of law. One way to avoid the dollar is to use the renminbi as a trade currency and convert yuan revenue into gold on the SGEI. If the rumor is true, Saudi Arabia is buying 1 kg bars, as there is virtually no trading in 12.5 kg bars on the SGEI. The benefit of 1 kg bars is that they are better suited to fully allocated trading.

The SGEI was set up in 2014 to give foreigners access to gold trading on the Shanghai Gold Exchange Main Board in the Chinese domestic gold market, and to trading on the International Board in the Shanghai Free Trade Zone (SFTZ). Foreign entities can't load gold into or out of Main Board certified vaults, but they can load gold into and out of International Board certified vaults (and thus import it into and export it from the SFTZ).

The objective of the International Board is to facilitate “offshore” gold trading in renminbi in the SFTZ, which has almost no effect on China’s current account. This is comparable to offshore gold trading in US dollars in London (offshore dollars pricing internationally traded commodities). Through the SGEI, China wants to increase the role of the renminbi in the global economy.

Overview from a few years ago on domestic and foreign clients’ SGE(I) trading privileges (source: Spot Trading Rules of the Shanghai Gold Exchange). I can’t find an update of this overview on the SGE(I) website, but I don’t expect the essence has changed. As you can see, the SGEI is the International Board and the SGE is the Main Board. Both exchanges fall under the same umbrella.

Investment possibilities for foreigners in Chinese financial assets are limited, but there are no restrictions to converting yuan into gold on the SGEI. I will write more on the mechanics of the Chinese gold market in a forthcoming article because this will be important in the coming years with respect to de-dollarization.

Last week, Christopher Wood from Jefferies mentioned in a note that the Saudis might be converting yuan into gold on the SGEI.

Source: Chris Wood from Jefferies, Greed and Fear, April 2023. H/t VBL Gold Fix and @LukeGromen.

If Saudi Arabia were converting yuan into gold on the SGEI, I would expect it to buy large bars weighing 400 ounces (12.5 kg). Data from the SGE and SGEI, though, reveals there has been practically no trading in 12.5 kg bars since the SGE was established in October 2002.


Weekly Shanghai Gold Exchange trading volume (12.5 Kg contracts)

In the above chart, volume is shown for exchange trading of 12.5 kg contracts. Not shown: over-the-counter (OTC) trading in the 12.5 kg contract on the International Board was zero in the past year. OTC trading in the 12.5 kg contract on the Main Board isn’t reported, which makes me think it’s nonexistent. All in all, large bar trading on the SGE(I) is extremely low.

Based on 12.5 kg contract trading volume on the SGE(I), it’s hard to prove the rumor is true, which doesn’t mean it can’t be true. Saudi Arabia can also buy 1 kg bars on the SGEI (and the SGE, though it wouldn’t be able to export gold traded on the Main Board). Trading in 1 kg contracts on the SGEI (iAu9999) and SGE (Au99.99) is not subdued, although there hasn’t been a significant uptick in iAu9999 trading recently.


Weekly Shanghai Gold Exchange trading volume (1 Kg contracts)

Interestingly, according to my sources, all gold traded in China’s foreign exchange market (CFETS) is settled and cleared through the SGE and is fully allocated. One of the reasons for this is that the underlying assets are the SGE 1 kg (9999 fine) and 3 kg (9995 fine) contracts. In Western foreign exchange markets, the underlying for gold trading is usually the LBMA Good Delivery bar, which weighs approximately 400 ounces and is more convenient to trade on an unallocated basis. For example, exchanging exactly 20,000 ounces in London is easy on an unallocated basis, while it’s difficult to collect a batch of large bars that together weigh precisely 20,000 ounces. Perhaps Asia is shifting to an alternative benchmark, and that’s why the Saudis are buying 1 kg bars? Time will tell.
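The allocated-versus-unallocated point above can be sketched numerically. The snippet below is an illustrative sketch only, with made-up bar weights and a hypothetical greedy bar-picking helper (not any exchange's actual settlement logic): LBMA Good Delivery bars vary between roughly 350 and 430 troy ounces each, so a pile of them almost never sums to a round figure, while uniform 1 kg bars land within a single bar's weight of any target.

```python
import random

OZT_PER_KG = 32.1507  # troy ounces per kilogram (approximate)

def closest_allocated(bar_weights_oz, target_oz):
    """Greedy sketch: add bars largest-first as long as they fit under
    the target, then report the remaining shortfall in ounces."""
    total = 0.0
    for w in sorted(bar_weights_oz, reverse=True):
        if total + w <= target_oz:
            total += w
    return target_oz - total

random.seed(0)
# Hypothetical vault of Good Delivery bars, ~350-430 ozt each.
large_bars = [random.uniform(350.0, 430.0) for _ in range(200)]
# Uniform SGE-style kilobars, ~32.15 ozt each.
kilobars = [OZT_PER_KG] * 2000

target = 20_000.0  # ounces to deliver
print(f"large-bar shortfall: {closest_allocated(large_bars, target):.2f} ozt")
print(f"kilobar shortfall:   {closest_allocated(kilobars, target):.2f} ozt")
```

With uniform kilobars the shortfall is bounded by one bar (about 32 ozt, here roughly 2.26 ozt for a 20,000 ozt target); with variable large bars the gap depends on which bars happen to be in the vault, which is why unallocated book entries, not physical bars, are what makes exact-figure transfers easy in London.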

Xi Jinping, President of the People’s Republic of China, visited Saudi Arabia in December 2022, where he pledged to continue buying oil and gas from Gulf Cooperation Council (GCC) nations and proposed that these trades be settled in yuan. From Xi:

China will continue to import large quantities of crude oil from GCC countries, expand imports of liquefied natural gas, strengthen cooperation in upstream oil and gas development, engineering services, storage, transportation and refining, and make full use of the Shanghai Petroleum and Natural Gas Exchange as a platform to carry out yuan settlement of oil and gas trade…

Shortly after, in January 2023, Saudi Arabia said it is open to discussions about trade in currencies other than the US dollar, according to the kingdom’s finance minister.

The Wall Street Journal wrote in March 2023 that, “Saudi Arabia is in active talks with Beijing to price some of its oil sales to China in yuan.”

These statements tell us there is a will in both countries to de-dollarize. Selling oil for yuan and then converting that yuan into gold would be a logical step, given the renminbi’s shortcomings as a reserve currency. But I would like to see more evidence before confirming this trend.

A few months ago, a person familiar with the matter but who prefers to stay anonymous told me Saudi Arabia is covertly buying gold, though he refrained from saying where it was bought. Perhaps the Saudis are slowly working on a transition; de-dollarization isn’t done overnight.


Source : Gainesville Coins

The Origins of Creativity

Louis Menand wrote . . . . . . . . .

What is “creative nonfiction,” exactly? Isn’t the term an oxymoron? Creative writers—playwrights, poets, novelists—are people who make stuff up. Which means that the basic definition of “nonfiction writer” is a writer who doesn’t make stuff up, or is not supposed to make stuff up. If nonfiction writers are “creative” in the sense that poets and novelists are creative, if what they write is partly make-believe, are they still writing nonfiction?

Biographers and historians sometimes adopt a narrative style intended to make their books read more like novels. Maybe that’s what people mean by “creative nonfiction”? Here are the opening sentences of a best-selling, Pulitzer Prize-winning biography of John Adams published a couple of decades ago:

In the cold, nearly colorless light of a New England winter, two men on horseback traveled the coast road below Boston, heading north. A foot or more of snow covered the landscape, the remnants of a Christmas storm that had blanketed Massachusetts from one end of the province to the other. Beneath the snow, after weeks of severe cold, the ground was frozen solid to a depth of two feet. Packed ice in the road, ruts as hard as iron, made the going hazardous, and the riders, mindful of the horses, kept at a walk.

This does read like a novel. Is it nonfiction? The only source the author cites for this paragraph verifies the statement “weeks of severe cold.” Presumably, the “Christmas storm” has a source, too, perhaps in newspapers of the time (1776). The rest—the light, the exact depth of frozen ground, the packed ice, the ruts, the riders’ mindfulness, the walking horses—seems to have been extrapolated in order to unfold a dramatic scene, evoke a mental picture. There is also the novelistic device of delaying the identification of the characters. It isn’t until the third paragraph that we learn that one of the horsemen is none other than John Adams! It’s all perfectly plausible, but much of it is imagined. Is being “creative” simply a license to embellish? Is there a point beyond which inference becomes fantasy?

One definition of “creative nonfiction,” often used to define the New Journalism of the nineteen-sixties and seventies, is “journalism that uses the techniques of fiction.” But the techniques of fiction are just the techniques of writing. You can use dialogue and a first-person voice and description and even speculation in a nonfiction work, and, as long as it’s all fact-based and not make-believe, it’s nonfiction.

The term “creative nonfiction” is actually a fairly recent coinage, postdating the advent of the New Journalism by about twenty years. The man credited with it is the writer Lee Gutkind. He seems to have first used “creative nonfiction,” in print, anyway, thirty years ago, though he thought that the term originated in the fellowship application form used by the National Endowment for the Arts. The word “creative,” he explained, refers to “the unique and subjective focus, concept, context and point of view in which the information is presented and defined, which may be partially obtained through the writer’s own voice, as in a personal essay.”

But, again, this seems to cover most writing, or at least most writing that holds our interest. It’s part of the author function: we attribute what we read not to some impersonal and omniscient agent but to the individual named on the title page or in the byline. This has little to do with whether the work is classified as fiction or nonfiction. Apart from “just the facts” newspaper journalism, where an authorial point of view is deliberately suppressed, any writing that has life has “unique and subjective focus, concept, context and point of view.”

Maybe Gutkind wasn’t naming a new kind of writing, though. Maybe he was giving a new name to an old kind of writing. Maybe he wanted people to understand that writing traditionally classified as nonfiction is, or can be, as “creative” as poems and stories. By “creative,” then, he didn’t mean “made up” or “imaginary.” He meant something like “fully human.” Where did that come from?

One answer is suggested by Samuel W. Franklin’s provocative new book, “The Cult of Creativity” (Chicago). Franklin thinks that “creativity” is a concept invented in Cold War America—that is, in the twenty or so years after 1945. Before that, he says, the term barely existed. “Create” and “creation,” of course, are old words (not to mention, as Franklin, oddly, does not, “Creator” and “Creation”). But “creativity,” as the name for a personal attribute or a mental faculty, is a recent phenomenon.

Like a lot of critics and historians, Franklin tends to rely on “Cold War” as an all-purpose descriptor of the period from 1945 to 1965, in the same way that “Victorian” is often used as an all-purpose descriptor of the period from 1837 to 1901. Both are terms with a load of ideological baggage that is never unpacked, and both carry the implication “We’re so much more enlightened now.” Happily, Franklin does not reduce everything to a single-factor Cold War explanation.

In Franklin’s account, creativity, the concept, popped up after the Second World War in two contexts. One was the field of psychology. Since the nineteenth century, when experimental psychology (meaning studies done with research subjects and typically in laboratory settings, rather than from an armchair) had its start, psychologists have been much given to measuring mental attributes.

For example, intelligence. Can we assign amounts or degrees of intelligence to individuals in the same way that we assign them heights and weights? One way of doing this, some people thought, was by measuring skull sizes, cranial capacity. There were also scientists who speculated about the role of genetics and heredity. By the early nineteen-hundreds, though, the preferred method was testing.

The standard I.Q. test, the Stanford-Binet, dates from 1916. Its aim was to measure “general intelligence,” what psychologists called the g factor, on the presumption that a person’s g was independent of circumstances, like class or level of education or pretty much any other nonmental thing. Your g factor, the theory goes, was something you were born with.

The SAT, which was introduced in 1926 but was not widely used in college admissions until after the Second World War, is essentially an I.Q. test. It’s supposed to pick out the smartest high-school students, regardless of their backgrounds, and thus serve as an engine of meritocracy. Whoever you are, the higher you score the farther up the ladder you get to move. Franklin says that, around 1950, psychologists realized that no one had done the same thing for creativity. There was no creativity I.Q. or SAT, no science of creativity or means of measuring it. So they set out to, well, create one.

They ran into difficulties almost immediately, and Franklin thinks that those difficulties have never gone away, that they are, in a sense, intrinsic to the concept of creativity itself. First of all, how do you peel away “creativity” from other markers of distinction, such as genius or imagination or originality or, for that matter, persistence? Are those simply aspects of a single creative faculty? Or can one score high on an originality or a persistence measure but low on creativity?

Then, do you study creativity by analyzing people commonly acknowledged to be creative—the canonical artist or composer or physicist—and figure out what they all have in common? Or could someone who has never actually created anything be creative, in the way that innately intelligent people can end up in unskilled jobs—the “born to blush unseen” syndrome? If that were the case, you would need an I.Q. test for creativity—call it a C.Q. test—to find such latent aptitudes.

But are all acts we call creative in fact commensurable? Is there some level on which the theory of relativity is no different from “Hamlet” or Pokémon? Psychologists said yes. Making something new, original, and surprising is what is meant by being creative, and a better mousetrap qualifies. What about creating something new, original, and terrible, like a weapon of mass destruction? Psychologists seem to have danced around that problem. For the most part, being creative, like being intelligent, rich, and thin, was something a person could never have too much of.

When psychologists asked what sort of habits and choices were markers of creativity, they came up with things like “divergent thinking” and “tolerance for ambiguity.” They reported that, on tests, creative people preferred abstract art and asymmetrical images. As Franklin points out, those preferences also happened to match up with the tastes of the mid-century educated classes. To put it a little more cynically, the tests seem to have been designed so that the right people passed them.

Franklin is understandably skeptical of the assumptions about mental faculties and inherent aptitudes made by the psychologists whose work he writes about. “By insisting on a psychological cause for creative accomplishment,” he says, “and bracketing all social factors, they deprived themselves of some of the most obvious explanations for creative accomplishment, trapping themselves instead in a tautological spiral that left them bewildered and frustrated.”

But, of course, this is also the problem with the SAT. In a meritocratic society, if creative accomplishment is, like intelligence, rewarded in the workplace, then it must be correlated with some inborn aptitude. Otherwise, we are just reproducing the existing social hierarchy. As Franklin observes, the creativity fad of the nineteen-fifties seems to have had zero impact on the privileged status of white males. The same is true of the SAT. It was not until colleges developed other methods of evaluating students with an eye toward increasing diversity, which generally meant giving less weight to standardized tests, that more dramatic effects on the demographics of higher education were seen.

The workplace, including business and the military, was the other area where the concept of creativity showed up after 1945. Postwar organizations prized creativity. In Franklin’s account, these two streams, psychological research and business demands, arose semi-independently, but they obviously fed into and reinforced each other. Employers wanted creative workers; psychologists claimed they had the means to identify them. The former gave work to the latter.

Why the imperative to hire creative people? Franklin suggests that competition with the Soviets, spurred by anxiety about a technology gap, drove the country to search for better ways to get the most out of its human resources. You could argue that the women’s movement arose out of the same impulse. Forget about women’s dignity and right to self-fulfillment. It was just irrational, when you were fighting a Cold War, to exclude half the population from the labor force.

But American industry might have come up with other rubrics besides “creativity” to use in retooling the workforce. Probably the principal factor in the shift to creativity was not the Cold War but the transformation of the American economy from manufacturing to service (which includes financial services, health care, information, technology, and education). Franklin reports that, in 1956, the number of white-collar workers exceeded the number of blue-collar workers for the first time in American history. That is a huge shift on the production side, and it coincided with a huge shift on the demand side—consumerism. The postwar economy was the supermarket economy: products, many of which might be manufactured offshore, sit on the shelf, begging you to buy them. This meant that business had to conceive of its priorities in a new way.

In the old manufacturing economy, if you operated a factory using the techniques of “scientific management,” your workers were not required to think. They were required only to perform set tasks as efficiently as possible. In that kind of business, creativity just gets in the way. But, if your business is about sales, marketing, product design, innovation, or tweaks on standard products, you need ideas, which means that you want to hire the kind of people who can come up with them.

An early and persistent strategy for maximizing creativity in the workplace was known as “brainstorming.” Management set up sessions where workers got together and batted around ideas, on the theory that discussions held without an agenda or top-down guidance would encourage people to speculate freely, to think outside the box. The belief was that this was how creative people, like artists and poets, came up with new stuff. They needed to be liberated from organizational regimens. So workers played at being artists. Dress was informal; sessions were held in relaxed settings designed to look like living rooms; conversation was casual (though someone was taking notes). The idea was not to accomplish tasks. The idea was to, essentially, make stuff up.

Brainstorming would eventually morph into a process called Synectics. Synectics is a far more immersive and permissive form of problem-solving, closer to group therapy. The assumption there is that you want to access the subconscious. That’s where the really novel ideas are.

Franklin suggests that brainstorming and Synectics sessions produced lots of bad or unusable ideas, and no surprise. You can’t free-associate a design solution or a marketing strategy from scratch. You need to have a pretty informed idea of what the box is before you can think outside it.

But part of the point of this brainstorming must have been to enable workers to feel ownership of the product. They weren’t just punching a clock. They were contributing to the creation of something, even if it was something for which there was no crying need. Franklin tells us that Synectics can be credited with two products: Pringles and the Swiffer. I guess you can’t argue with that—though it’s interesting to learn that when you descend into the depths of the subconscious, you emerge with . . . a Pringle.

Franklin argues that the appeal of workplace creativity was that it addressed two anxieties about modern life: conformity and alienation. Postwar intellectuals worried about the “organization man” (the title of a book by the journalist William Whyte) and the “other-directed” personality (diagnosed in the sociologist David Riesman’s “The Lonely Crowd”). These were seen as socially dangerous types. People who did what they were told and who wanted to be like everyone else, who were not “inner-directed,” were people easily recruited to authoritarian movements. They were threats to liberal democracy, and hence to the free-market economy.

The branch of psychology most attuned to anxieties about alienation and conformity is known as humanistic psychology. For the humanistic psychologists, creativity is linked to the concept of authenticity. It is, at bottom, a means of self-expression. Uncreative people are rigid and repressed; creative people are authentically themselves, and therefore fully human. As the psychologist and popular author Rollo May put it, creativity is not an aberrant quality, or something associated with psychic unrest—the tormented-artist type. On the contrary, creativity is “the expression of normal people in the act of actualizing themselves.” It is associated with all good things: individualism, dignity, and humanity. And everyone has it. It just needs to be psychically unlocked.

You can see hints of the counterculture here, and humanistic psychology did lead, as Franklin notes, to encounter therapy, T-groups, and sensitivity training. What’s interesting, though, is that it was in American business, and not the Haight-Ashbury, that these ideals first became enshrined. Countercultural values turned out to be entirely compatible with consumer capitalism in the information age. “The postwar cult of creativity,” Franklin says, “was driven by a desire to impart on science, technology, and consumer culture some of the qualities widely seen to be possessed by artists, such as nonconformity, passion for work, and a humane, even moral sensibility, in addition to, of course, a penchant for the new.”

The industry that most avidly grabbed on to the term “creative” to glamorize what it did was the very motor of consumerism: advertising. In the nineteen-fifties and sixties, ad agencies abandoned the old “reason why” mode of advertising a product (“Here is what you need it for”) and replaced it with branding. They were no longer selling a product. They were selling an idea about the product. People were buying an image they wanted to be associated with. It was the adman’s job to create that image.

Creating an image for a marketing campaign or tweaking a product line seems pretty distant from writing a poem or painting a picture. But the creativity conceptualizers, Franklin says, sought to elide the difference. He thinks that management theorists wanted to appropriate the glamour and prestige of the artist and confer those attributes on admen and product designers.

Yet wasn’t the glamour and prestige of the artist related to a popular belief that artists are not interested in worldly things or practicality? Workplace creativity was supposed to be good for business. It was supposed to increase productivity and make money, things that are not supposed to motivate poets. Yet it’s easy to believe that business could co-opt the reputation of the fine arts without much trouble. The joy of creation plus a nice income. It was the best of both worlds.

Readers do not normally wish books longer, but a couple of discussions are missing from “The Cult of Creativity.” One is about art itself. The early Cold War was a dramatic period in cultural history, and claims about originality and creativity in the arts were continually being debated. Among the complaints about Pop art, when it bounded onto the scene, in 1962, was that the painters were just copying comic books and product labels, not creating. It’s possible that as commercial culture became more invested in the traditional attributes of fine art, fine art became less so.

One also wishes for more on the twenty-first century. Franklin says that the creativity bubble began to shrink in the nineteen-sixties, but it plainly got reinflated in the nineteen-nineties. The pages Franklin devotes to the contemporary creativity landscape are the freshest and most fun in the book.

The iconic image of the startup economy—casually dressed workers in open spaces jotting inspirations on a whiteboard—is a barely updated version of the old nineteen-fifties brainstorming sessions. Those startup workers are also taking ownership (usually in the form of stock options, it’s true) of the products the company makes.

The landscape of the tech universe is shifting right now, but for several decades a whole creativity life style became associated with it. Work was play and play was work. Coders dressed like bohemians. Business was transacted (online) in cafés, where once avant-gardists had sipped espresso and shared their poems. “The star of this new economy,” Franklin writes, was “the hip freelancer or independent studio artist, rather than the unionized musician or actor who had been at the heart of the cultural industries.” In his view, this is perfectly natural, since “creativity” was an economic, not aesthetic, notion to begin with. “The concept of creativity,” he concludes, “never actually existed outside of capitalism.”

Franklin doesn’t mention “creative nonfiction,” either. But his book does give us a way of understanding the term as an effort to endow nonfiction writers with the same qualities—individualism, outside-the-box thinking, and invention—that creative people are assumed to possess. “Creative nonfiction” in this respect doesn’t mean “made up.” It’s an honorific. In an economy that claims to prize creative workers, the nonfiction writer qualifies.

Creating things today seems to be as cool as it ever was. Fewer college students may be taking literature courses, but creative-writing courses are oversubscribed. And what do those students want to write? Creative nonfiction.


Source : The New Yorker

What’s Keeping China From Moving Up the Value Chain?

Wang Xing wrote . . . . . . . . .

Over the past few decades, China’s low labor costs helped transform it into the “world’s factory,” a center of industrial production for companies around the globe.

Now, with wages rising, the workforce shrinking, and the global supply chain undergoing a significant reshuffle, China faces an even stiffer challenge: upgrading its labor-intensive low-end manufacturing industries to better compete in the world of value-added manufacturing.

Most discussions of this upgrade tend to focus on the adoption of cutting-edge technologies like artificial intelligence and automation. But what about the workforce?

Fantasies of a post-labor industrial landscape aside, these new technologies will ask a lot of workers, and it’s increasingly apparent that China doesn’t have the pool of skilled staff necessary to move up the value chain. There are a number of reasons for this, starting with high turnover rates and unstable labor relations that prevent workers from effectively acquiring new skills. Life for many Chinese industrial workers is characterized by instability: They’re constantly switching jobs, companies, and even cities. A decade ago, manufacturers in the Pearl River Delta, one of China’s most prosperous manufacturing hubs, reported that as many as 20% to 30% of new employees left within three months of recruitment. More recent government data suggests turnover is rising.

High turnover rates make it harder for employees to hone their skills. Contributing to poor employee retention is China’s hukou household registration system, which effectively locks migrants out of public services in their city of residence. Housing is another major problem. While conducting fieldwork in a major Chinese city, we found that close to 60% of industrial workers are in a precarious living situation, characterized by poor housing, insufficient living space, or a lengthy commute. Housing prices have skyrocketed in Chinese cities over the past decade, while the urban villages that once offered affordable and accessible accommodation to migrant workers are being torn down, pushing their residents to the outskirts.

Precarious living circumstances often push workers into more flexible forms of employment. Deprived of stable housing and discriminated against by urban welfare systems, migrant workers have no way of putting down roots in the city, which in turn makes it hard for them to receive long-term technical training from a single employer.

For their part, manufacturers have embraced “flexible” employment systems such as seasonal employment, contract labor, and shared employee pools.

Although this helps companies cut down on labor costs, it disrupts industrial workers’ development. China is reliant on employers to train and accredit workers: In one of my past surveys of industrial workers, 67.1% of respondents said their technical training was provided by their employer, while only 19.9% received training at private institutions or vocational colleges. However, because employers assume skilled employees will be poached by their rivals, many small- and medium-sized enterprises are unwilling to invest in employee training. In our survey of companies, we discovered that 61% hadn’t established internal training programs.

Meanwhile, the government and employers cannot agree on a training model. Both companies and the government have their own certification systems, and each side refuses to recognize the other. This means that government-issued certificates — which tend to overemphasize cultural and theoretical knowledge — rarely translate into better wages for industrial workers, while the government is often skeptical of funding certificate programs created by companies or industry associations.

When industrial workers do acquire useful skills, it rarely translates into higher social status. Even skilled factory workers are at a huge disadvantage compared to their white-collar counterparts when it comes to wages, access to basic public services, and applying for a hukou.

Their career prospects are similarly limited. In manufacturing, there are only a few pathways for promotion open to the rank and file, and none of them lead very far. My research suggests that, although there is a significant demand among Chinese industrial workers for vocational training, the vast majority aren’t interested in studying skills applicable to their current job. Rather, they’d prefer to receive academic education or business training. This is perhaps a reflection of how underappreciated and dissatisfied they feel in their current positions.

When it comes to upgrading China’s industrial base, investing in technological progress such as automation and artificial intelligence isn’t enough. We must acknowledge that millions of industrial workers will still be needed to support Chinese manufacturing, and that their skill level will determine the success or failure of any effort to reform the national economy.

Facilitating workers’ acquisition of new skills will require the cooperation of workers themselves, as well as that of government departments, companies, and vocational colleges. And that starts with ensuring workers have equal access to social resources in their city of employment, regardless of their background, so that they can accumulate skills and meaningful professional experiences.

Second, training and certification standards should be consistent across companies and training institutions, with an emphasis on practical abilities rather than theoretical knowledge. Moreover, while companies and vocational colleges should be encouraged to collaborate on internship programs as a means of merging theory and practice, we must take care to prevent trainees from being exploited as a source of cheap labor.

Finally, in the interests of decreasing the cost of formal recruitment and curbing the shift to “flexible employment,” China must crack down on unfair collusion between companies while incentivizing them to provide workers with higher and more equal wages.


Source : Sixth Tone

Study: ChatGPT Scores Nearly 50 per cent on Board Certification Practice Test for Ophthalmology

A study of ChatGPT found the artificial intelligence tool answered less than half of the test questions correctly from a study resource commonly used by physicians when preparing for board certification in ophthalmology.

The study, published in JAMA Ophthalmology and led by St. Michael’s Hospital, a site of Unity Health Toronto, found that ChatGPT correctly answered 46 per cent of the questions when the test was first administered in Jan. 2023. When researchers administered the same test one month later, ChatGPT’s score rose by 12 percentage points, to 58 per cent.

The potential of AI in medicine and exam preparation has garnered excitement since ChatGPT became publicly available in Nov. 2022. It has also raised concerns about the potential for incorrect information and for cheating in academia. ChatGPT is free, available to anyone with an internet connection, and works in a conversational manner.

“ChatGPT may have an increasing role in medical education and clinical practice over time; however, it is important to stress the responsible use of such AI systems,” said Dr. Rajeev H. Muni, principal investigator of the study and a researcher at the Li Ka Shing Knowledge Institute at St. Michael’s. “ChatGPT as used in this investigation did not answer sufficient multiple choice questions correctly for it to provide substantial assistance in preparing for board certification at this time.”

Researchers used a dataset of practice multiple choice questions from the free trial of OphthoQuestions, a common resource for board certification exam preparation. To ensure ChatGPT’s responses were not influenced by concurrent conversations, entries or conversations with ChatGPT were cleared prior to inputting each question and a new ChatGPT account was used. Questions that used images and videos were not included because ChatGPT only accepts text input.

Of the 125 text-based multiple-choice questions, ChatGPT answered 58 (46 per cent) correctly when the study was first conducted in Jan. 2023. Researchers repeated the analysis in Feb. 2023, and ChatGPT’s performance improved to 58 per cent.
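The reported figures can be sanity-checked with a few lines of arithmetic (a minimal sketch; the raw number of correct answers in the February run is not given in the article, so only its reported percentage is used):

```python
# Sanity-check the scores reported in the ophthalmology study.

TOTAL_QUESTIONS = 125   # text-based multiple-choice questions used
JAN_CORRECT = 58        # questions answered correctly in the Jan. 2023 run
FEB_PCT = 58            # reported score (per cent) of the Feb. 2023 re-run

# January score as a rounded percentage: 58 / 125 = 46.4% -> 46
jan_pct = round(100 * JAN_CORRECT / TOTAL_QUESTIONS)

# Month-over-month gain in percentage points
improvement = FEB_PCT - jan_pct

print(f"Jan: {jan_pct}%  Feb: {FEB_PCT}%  gain: {improvement} points")
```

This confirms a gain of about 12 percentage points, consistent with the article’s “more than 10 per cent higher” phrasing.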

“ChatGPT is an artificial intelligence system that has tremendous promise in medical education. Though it provided incorrect answers to board certification questions in ophthalmology about half the time, we anticipate that ChatGPT’s body of knowledge will rapidly evolve,” said Dr. Marko Popovic, a co-author of the study and a resident physician in the Department of Ophthalmology and Vision Sciences at the University of Toronto.

ChatGPT closely matched how trainees answer questions, and selected the same multiple-choice response as the most common answer provided by ophthalmology trainees 44 per cent of the time. ChatGPT selected the multiple-choice response that was least popular among ophthalmology trainees 11 per cent of the time, second least popular 18 per cent of the time, and second most popular 22 per cent of the time.

“ChatGPT performed most accurately on general medicine questions, answering 79 per cent of them correctly. On the other hand, its accuracy was considerably lower on questions for ophthalmology subspecialties. For instance, the chatbot answered 20 per cent of questions correctly on oculoplastics and zero per cent correctly from the subspecialty of retina. The accuracy of ChatGPT will likely improve most in niche subspecialties in the future,” said Andrew Mihalache, lead author of the study and undergraduate student at Western University.


Source: Unity Health Toronto