Talk:OpenAI/Archive 1
This is an archive of past discussions about OpenAI. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page.
Archive 1 | Archive 2
The following discussion is closed. Please do not modify it. Subsequent comments should be made on the appropriate discussion page. No further edits should be made to this discussion.
Per WP:NPRODUCT, "For product lines that are produced and/or marketed by the same company, avoid creating multiple stubs about each individual product (e.g., PU-36 Explosive Space Modulator, Q-36 Explosive Space Modulator, R-36 Explosive Space Modulator, etc.) especially if there is no realistic hope of expansion. The relationship between a continuous line of products should be discussed within a single article."
The general scope of GPT (language model) is currently covered within the OpenAI article, which seems like an apt location to merge this to until such time as we have something beyond routine business announcements that cover this. — Red-tailed hawk (nest) 04:21, 4 February 2023 (UTC)
- Perhaps GPT-4 (and 5, 6, 7, etc.) article should be merged into ChatGPT rather than the main OpenAI article?
Alternatively, perhaps both should be merged --> OpenAI (unless this results in something too voluminous and unwieldy for mere humans to process)? Cheers, Cl3phact0 (talk) 11:34, 6 February 2023 (UTC) - PS: Perhaps we should ask it what it thinks?
- PPS: The same logic may also apply to the GPT-3 article.
- Oppose merge of ChatGPT and GPT-3 into OpenAI. They deserve their own articles as they are major products and widely covered by reliable sources on their own. Support redirect of GPT-4 to OpenAI until GPT-4 is released and more information becomes available. --Ita140188 (talk) 12:14, 7 February 2023 (UTC)
- For clarity, my suggestion (in reply to the original proposed merge) is to merge all GPT versions (3, 4, n) into a single main ChatGPT or GPT article (not OpenAI). Perhaps my alternative suggestion made this unclear (I will strike it out).
- I agree with your opposition to merging them into OpenAI article, and also think that the Generative models sub-section could be shortened by moving any detailed information to a main GPT article (per above). Cheers, Cl3phact0 (talk) 12:32, 7 February 2023 (UTC)
- I agree with your opposition. Keep GPT-2, GPT-3, and ChatGPT as separate articles, but merge GPT-4 into the OpenAI or GPT-3 article for now. Yannn11 15:46, 7 February 2023 (UTC)
- Merge, but not to OpenAI but to main GPT article. I'll also support merge of all GPT-2/3/4/n to one article, probably Generative pre-trained transformer. Artem.G (talk) 20:10, 7 February 2023 (UTC)
- Thanks for making it succinct. Agree. Cl3phact0 (talk) 22:54, 7 February 2023 (UTC)
- Oppose, --- Tbf69 userpage • usertalk 19:56, 8 February 2023 (UTC)
- Oppose We shouldn't merge them, there is substantial media attention on both. Mixed Biscuit (talk) 23:09, 18 February 2023 (UTC)
- Keep it as is. We have a whole 2,000 word article on Apple's electric car project, slated for release in 2026. I think GPT-4 is notable enough that we can afford a separate article, and avoid the confusion of picking an appropriate redirect target.
- NPRODUCT is a weak argument, since there is obviously "realistic hope for expansion"; it's more pertinent for almost-identical products like 1990s Macintoshes or whatever the heck "R-36 Explosive Space Modulators" are, but not for this. CRYSTAL would be a better argument, but I think we can (and should) make an exception in this case, since no merge target would provide valuable context to this article (MERGEREASON #5). I argue it fails MERGEREASON #4 as well.
- Also tentatively oppose merging all GPT-X articles into Generative pre-trained transformer: This'll sound weird, but I've noticed the WP:RS "narrative" regarding LLMs is pretty "mercurial", as in: they give a pretty fascinating mirror into how societal views keep shifting regarding the dangers & opportunities of LLMs. See for example the difference in the "mood" of coverage between LaMDA (not OpenAI) and ChatGPT, or even between GPT-3 and ChatGPT. I worry that if we merge them all, we might lose those nuances. The Reception sections are the kind of curiosity-catnip you usually only find in Featured articles, and a merge would risk losing them. DFlhb (talk) 22:32, 8 February 2023 (UTC)
- Strong Support for merge per WP:CRYSTAL: "short articles that consist of only product announcement information and rumors are not appropriate". We can mention GPT4 briefly in both OpenAI and Generative Pre-trained Transformer --TocMan (talk) 03:05, 21 February 2023 (UTC)
- Alternative merge as proposed by @Artem.G. This GPT article is much better for all things GPT except for extremely notable stuff like ChatGPT. InvadingInvader (userpage, talk) 23:27, 21 February 2023 (UTC)
- What makes ChatGPT extremely notable as opposed to GPT-2 or GPT-3? Yannn11 15:26, 27 February 2023 (UTC)
- Merge to GPT-3. GPT-4 is the successor to it, but it doesn't have enough notability demonstrated for its own distinct article. SWinxy (talk) 01:52, 2 March 2023 (UTC)
- Strong support merge because it's simply not notable enough yet. There is literally no information from OpenAI about it. I do believe it will eventually warrant an article, but WP:NOTJUSTYET. – Popo Dameron talk 05:26, 10 March 2023 (UTC)
- Comment For the record, Microsoft announced yesterday that GPT-4 would come next week. Pointless to merge when we'll need to unmerge almost immediately. DFlhb (talk) 05:32, 10 March 2023 (UTC)
- Oppose per Dflhb. GPT-4 will 100% be notable after it comes out in under a week. There are unconfirmed rumors that Bing is using it. Snowmanonahoe (talk) 20:52, 10 March 2023 (UTC)
- Oppose as per DFlhb and Ita140188. Don’t Get Hope And Give Up — Preceding undated comment added 11:01, 11 March 2023 (UTC)
- Oppose It should be notable enough in a few days. — Omegatron (talk) 18:24, 13 March 2023 (UTC)
As it was released today, this discussion no longer makes sense, so I'm closing it and removing the merge templates from both articles. Artem.G (talk) 17:35, 14 March 2023 (UTC)
New Headquarters
Seems to be based in the SF Bay Area. Mercurywoodrose (talk) 06:10, 12 December 2015 (UTC)
- Done. Thanks, Gap9551 (talk) 17:49, 12 December 2015 (UTC)
- Thanks, I couldn't find a ref, but you did. Mercurywoodrose (talk) 19:22, 12 December 2015 (UTC)
OpenAI recently moved headquarters and are no longer in the Pioneer Building. Their new office is nearby, although not yet public. R3SPACE7 (talk) 20:23, 19 May 2023 (UTC)
See also itis
Many articles like this have too many "see also"s. We shouldn't just place every related article here. It should be lists that include this article, and articles directly related that have not been able to fit well into the actual article. Other institutes should NOT be listed, but should be in the body of the article as reliable sources themselves make the link or connection. It's more a style point; too many see alsos means we are doing the research for the reader on what's interesting to them. We could put a see also for "luddites" for people who read this and say "hell no i hate this", or a link to brain development articles, history of computing, other think tanks in the bay area, cool AI projects like Watson, the Singularity, Roger Penrose who says we can't develop AI, and the movie AI. The list goes on and on. Mercurywoodrose (talk) 19:30, 12 December 2015 (UTC)
- I agree. I added just one originally, about the topic Existential risk from advanced artificial intelligence that closely matches the aims of the company. There are several institutes with similar goals, but they can also be found in Category:Existential risk organizations. I removed Allen Institute for Artificial Intelligence for starters, as they seem not to be specifically aiming to reduce risks associated with AI, just to develop advanced AI in general. Gap9551 (talk) 00:47, 13 December 2015 (UTC)
the field and the company
This article is supposed to be about the company, not AI in general, but I see in the article a great deal of general discussion about the future prospects for AI. It doesn't belong here. DGG ( talk ) 22:33, 3 February 2016 (UTC)
- DGG I originally added the content because the mainstream media coverage of OpenAI talks in great detail about the donors' motivating beliefs about the future prospects for AI. I know you're a busy admin; maybe you didn't have time to read the sources? Do you want to discuss this and the "promotional" tag some more, or would it satisfy your concerns if I just ask WP:THIRDOPINION for an opinion to avoid taking up too much of your time? (Of course, if anyone else on this page shares DGG's concerns and would like to elaborate on possible concerns, feel free to chime in.) Rolf H Nelson (talk) 20:34, 6 February 2016 (UTC)
Re: GPT3 "Pre-training GPT-3 required several thousand petaflop/s-days of compute, compared to tens of petaflop/s-days for the full GPT-2 model."
The following discussion is closed. Please do not modify it. Subsequent comments should be made on the appropriate discussion page. No further edits should be made to this discussion.
A while ago I put up a tag saying copy edit was needed, and it was reverted with a summary stating "[t]his is a correctly used technical term". I've never seen the term petaflop be used in that particular way before.
These are the two glaring typographical irregularities that have gotten me stumped:
- petaflop/s-days: I assume this was supposed to mean either petaflops/day or petaflop-days, but both nouns are in their plural forms.
- I have no idea what a petaflop-day is supposed to be.
- petaflops/day would mean quadrillions of operations per second per day, which would suggest that pre-training either GPT would require a computer to perform a certain amount of PFLOPS on one day, and more PFLOPS than that on the next day, and so on.
- "of compute"; I can't decide if it should be corrected to "of computation" or "to compute", so I've left that part as-is.
-- MrPersonHumanGuy (talk) 02:04, 30 August 2020 (UTC)
- Update: I see someone has added something in parentheses to clarify that several thousand petaflop/s-days are "a unit equivalent to approximately 10^20 neural net operations". A thousand PFLOPS would be 10^18 floating point operations a second. Or, since there's 86,400 seconds in a day, a petaflop would mean 8.64 × 10^19 floating-point operations on a daily basis.
- After some digging through the edit history, I've found the edit that introduced the odd writing. Below is the prose it replaced, but I've modified the notes and citations to prevent them from adding a list to the bottom of the talkspace:
- Lambda Labs estimated that GPT-3 would cost US$4.6M and take 355 GPU years to train using state-of-the-art[b] GPU technology.[64] Another source lists training costs of US$12M and memory requirement of 350GB on an undisclosed hardware configuration.[65] Yet another estimate by Intento calculated that GPT-3 training would take 1 or 2 months[c] and might consume 432 MWh (1,555 GJ) of electricity if run 24/7. [66]
- If the overwriting sentence was supposed to specify how many PFLOPS and days "of compute [sic]" it took to pre-train GPT-3, then petaflops and days should be separate words with separate amounts. -- MrPersonHumanGuy (talk) 18:44, 30 August 2020 (UTC)
- Update 2: To quote the source the clarifier cited:
- A petaflop/s-day (pfs-day) consists of performing 10^15 neural net operations per second for one day, or a total of about 10^20 operations.
- That is from the second footnote, which is for this sentence:
- The total amount of compute, in petaflop/s-days,[2] used to train selected results that are relatively well known, used a lot of compute for their time, and gave enough information to estimate the compute used.
- I think it's a bit funny how the author(s) of the OpenAI blog AI and Compute used the word compute in place of computation all over the place, as if the verb is also a common noun. -- MrPersonHumanGuy (talk) 12:26, 31 August 2020 (UTC)
- I'm a little late to the game, so this response is for all those students, science and non-science. @MrPersonHumanGuy, there is a reason why your high school science or chemistry teacher emphasized and stressed always paying attention to the use of units in calculations.
- It is quite common in technical, and particularly science, fields to have complex units (qualifiers): foot–pounds vs. newton–meters. That is, units other than simple ones such as inch, gallon, ton, calorie. So your misapprehension is probably a lack of exposure.
- Firstly, the OpenAI terminology: Petaflop/s-day (sic) and pfs-day (sic). The notation is misleading: the "/" (division) should have been a dash, as in a complex unit, and in "s-day" the dash should have been a "/" divisor, i.e. sec/day.
- Pardon the scientific notation. Petaflop is understood to be 1 executed computer op-code with qualifier 10^15 per second, and s-days(sic) would be 8.64 * 10^4 seconds/day (i.e. 60 sec/min * 60 min/hr * 24 hr/day = 86,400 sec/day ).
- So 1 petaflop–s-day = (1 * 10^15 op/sec) * (8.64 * 10^4 sec/day). Which reduces to 8.64 * 10^19 op/day. Approximately 10^20 op/day. Q.E.D. WurmWoodeT 02:23, 12 January 2022 (UTC)
- Wow, wouldn’t it be simpler to just, say, explain whether you want to express a RATE or just a certain number of (floating-point) operations? If the rate is important, what does it matter if it’s per second or per day? So x petaflops per second is just 86,400 times that if it’s per day. That’s not a deep concept! And if it’s just the number of operations, just express it that way. It’s a pure number of flops. Why is all that nonsense about dashes and slashes an issue? You’re not taking an area under a curve (e.g., something like “passenger-miles”). I don’t get it. It’s unhelpful to complexify flops for a day and talk about flops per second for a day and express it as “second-days”; that kind of dimensional expression is only needed for incompatible units. Days and seconds are certainly NOT incompatible. Roricka (talk) 17:42, 21 November 2023 (UTC)
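For anyone trying to follow the arithmetic in this thread, here is a minimal Python sketch of the conversion, assuming the OpenAI blog definition quoted above (one petaflop/s-day is 10^15 neural-net operations per second sustained for one day); the 3,000 figure below is only an illustrative stand-in for the article's "several thousand", not a number from the sources.

# Sketch of the petaflop/s-day conversion, per the OpenAI definition quoted above.
OPS_PER_SECOND = 1e15            # one petaflop/s = 10^15 operations per second
SECONDS_PER_DAY = 60 * 60 * 24   # 86,400 seconds in a day

ops_per_pfs_day = OPS_PER_SECOND * SECONDS_PER_DAY
print(f"1 petaflop/s-day = {ops_per_pfs_day:.2e} operations")  # 8.64e+19, i.e. roughly 10^20

# Illustrative only: the article text says "several thousand" petaflop/s-days for GPT-3;
# 3,000 is an assumed stand-in, not a sourced figure.
illustrative_pfs_days = 3000
total_ops = illustrative_pfs_days * ops_per_pfs_day
print(f"{illustrative_pfs_days} petaflop/s-days = {total_ops:.2e} operations")  # about 2.6e+23

The "day" here multiplies rather than divides, in the same way a kilowatt-hour is a kilowatt sustained for an hour, which is why the result is a total count of operations rather than a rate.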
Re deletion of the reference to the sex-bot article
Regarding deletion of the reference to the article "Re: Sex-Bots -- Let Us Look Before We Leap" ( http://www.mdpi.com/2076-0752/7/2/15 ), several points are in order.
First, the journal in which the article appears is relatively new, but it is not obscure, having recently published, for example, two articles by tech industry heavyweights -- "Can Computers Create Art?" ( http://www.mdpi.com/2076-0752/7/2/18 ) and "Art in the Age of Machine Intelligence" ( http://www.mdpi.com/2076-0752/6/4/18 ) -- which have enjoyed between them some 9,400 page views.
And yes, the article in question is an opinion piece -- but at this point in time, opinion is all we have; i.e., there is no one who can say with authority where AI is leading us, much less AI-enabled sex-bots! So if someone -- and that someone, BTW, is yours truly, although I don't think I've broken any of the WP:SELFCITE guidelines -- takes the time to express his concerns in a carefully thought-out and articulated piece, and if that piece is in turn given careful scrutiny before being published -- and yes, "Let Us Look Before We Leap" underwent a thorough peer review at Arts, even though published by them as "Opinion" -- what more could we expect from a source cited in Wikipedia regarding the quite critical and quite speculative subject of AI?
And finally, regarding the argument that this Wikipedia article should be about the company and not AI in general, the fact is that OpenAI has, by its very charter, captured the subject of the desirability of requiring that all AI code to which the public is subject be open source (just as, for example, we now require public disclosure of the details of all pharmaceuticals), and thus likewise the quite understandable goal of someone who thinks that this is the correct approach: he has taken the time to articulate his arguments and have them published in a reputable journal; and he now wishes in turn to share them with a larger Wikipedia audience via said article.
Comments, please! I am obviously aiming at a re-instatement of the deleted content, but can certainly be dissuaded therefrom. Synchronist (talk) 04:06, 1 August 2018 (UTC)
- I'll suspend judgement then on whether it's obscure. I removed the content based on its not meeting WP:RS; I'm happy to solicit other opinions though. We can always ask WP:DRR/3O for a third opinion if nobody else on talk has any thoughts on the matter. Rolf H Nelson (talk) 17:19, 5 August 2018 (UTC)
- The use is inappropriate: it has no mention or apparent relevance to OpenAI, we can't put things together like this per WP:SYNTH. Whether someone wants to share it is irrelevant, this is about things that are related to OpenAI. Not just the use of the source, but the commentary "...one juried commentator has asked..." is not encyclopedic. K.Bog 01:32, 28 August 2018 (UTC)
Solving Rubik’s Cube with a robot hand
Just want to attract attention to a new article published by OpenAI on October 15th, 2019, about how they made a system that learned to solve Rubik's Cube all by itself, using only one hand (a Shadow Dexterous Hand). Maybe someone wants to add a mention of this to the article. There is a blog post, Solving Rubik’s Cube with a Robot Hand, and a scientific paper of the same name: Solving Rubik’s Cube with a Robot Hand. --Jhertel (talk) 17:38, 19 October 2019 (UTC)
Removal of Controversy Section?
The section I added about controversy concerning OpenAI is completely warranted. I can assure you the creation of OpenAI LP has generated controversy. Again, just last week with the pricing announcement of GPT-3, the no-longer-open company structure of OpenAI was debated. Could you elaborate on your reasons for removing the entire Controversy section, HaeB? Diff here: http://en.wiki.x.io/w/index.php?title=OpenAI&diff=975914486&oldid=975824355
I'd like to add the announced pricing of GPT-3 to the controversy section as well, but before doing that and getting it removed again, this needs resolving imho. I've seen in many Wikipedia articles that the controversial things about a subject are repeated in that section, so imho it doesn't warrant a complete erasure. Additionally, it is true that they are still filing as a non-profit, which is controversial given how non-transparent they have been lately. — Preceding unsigned comment added by Juliacubed (talk • contribs) 07:35, 4 September 2020 (UTC)
- I agree that turning for profit generated a lot of controversy and deserves a section. See https://techcrunch.com/2019/03/11/openai-shifts-from-nonprofit-to-capped-profit-to-attract-capital/ and https://www.technologyreview.com/2020/02/17/844721/ai-openai-moonshot-elon-musk-sam-altman-greg-brockman-messy-secretive-reality/ and https://www.wired.com/story/dark-side-big-tech-funding-ai-research/ Yannn11 16:17, 11 July 2021 (UTC)
There clearly should be a Controversy section. The name "OpenAI" is deliberately misleading - it suggests that all development is open source, which is clearly not the case. Removal of the Controversy section seems to me to have been vandalism. But now the page is protected so that it's difficult to add it back. Sayitclearly (talk) 11:32, 6 December 2022 (UTC)
- This page is no longer protected, so you should be able to add the section back in. I would, but as a leader of a competing org I have a clear COI.
- Stellaathena (talk) 20:58, 1 April 2023 (UTC)
For Profit owned by Non Profit? What?
This is a general encyclopedia for everyone. We need to explain this corporate/organisation construct and who can possibly profit or not profit from it. The current article is bound to confuse rather than clear things up. Can we please get someone who knows about this legal construct to explain it? Thanks so much. --91.64.59.134 (talk) 20:48, 24 October 2021 (UTC)
Controversies
Hi GobsPint, I reverted [this edit] due to inaccuracies, overcitation, writing style, and because its primary goal seems to be to criticize rather than inform. I'm sorry if this is harsh; here are a few additional explanations:
- The original Time article, from which the other articles were derived, was pretty good, and I don't think there is a need to have 7 citations for a single sentence. Having one single citation containing all the required information, when possible, is cleaner and makes verification easier. These redundant sources were already removed by PeaceSeekers; I don't think there is a good reason to add them back.
- From what I saw, "[...] was reviewed without psychosocial support" gives a false impression. What was said in the original Time article is that "sessions with “wellness” counselors [...] were unhelpful and rare due to high demands to be more productive at work". Moreover, these sessions were organised by Sama, not OpenAI.
- I don't see any attempt to integrate your content with the existing text. The text was just added, however redundant it is with what is already written in the controversies section. "OpenAI has been criticized for outsourcing [...]" is written twice.
- Writing the statement that a company has a "hypocritical approach" implies that there is a proven disingenuity in OpenAI's approach, which is not obvious and makes a subjective judgment. The given source says that, but it doesn't look dispassionate. News articles can write about personal opinions, unlike Wikipedia.
Alenoach (talk) 14:18, 27 August 2023 (UTC)
The original text is an accurately sourced, succinct summation of the controversy. The replacement text's verbosity is WP:TLDR and WP:UNDUE, relies on WP:PRIMARY sources through secondary source exclusion, and only lists exposure to textual sexual violence, neglecting to mention exposure to other graphic textual content. The fact that other sections in the article routinely utilize more than a single reference (which is the norm), while the proposed controversy text does not (despite being controversial), suggests this is a WP:WHITEWASH. GobsPint (talk) 17:06, 2 September 2023 (UTC)
- Hi GobsPint,
- Investigative journalism like in the [Time article] doesn't seem to be typically considered a primary source (see this comment). This can be confusing, I agree, but the Time article analyzes, interprets and synthesizes information from primary sources such as interviews and documents.
- I agree that the topic received significant media coverage, and I don't think having 7 references is needed to prove that. Nevertheless, I think it's ok to have 2 or even 3 references if it adds substantial information or if none of them is sufficient alone to prove the point. It just seems to me that most of these articles are derived from the Time article, which is sufficiently reliable to support what is written and gives a good overview of the topic.
- The article also covers an image labeling contract used for image generation. It could be worth mentioning it, while briefly clarifying (correct me if I'm wrong) that the textual datasets came from OpenAI but the image datasets did not. "graphic textual content" may be ambiguous, so it looks better to use explicit terms like "textual descriptions" or "images".
- The clarifications about the salaries of the Kenyan workers make the paragraph longer, but I think it is notable because a lot of articles tend to focus on it. I also think that it is important to give the context and to clarify the distinction between what OpenAI has done and what Sama has done. Alenoach (talk) 14:37, 3 September 2023 (UTC)
OpenAI has been criticized for outsourcing the curation of its data sets for active learning to the developing world where graphic textual content involving child sexual abuse, bestiality, incest, murder, racism, suicide, self harm, torture, and violence was reviewed without psychosocial support causing reviewers to develop post-traumatic stress disorder.[1][2][3][4][5][6][7]
OpenAI has been criticized for it's hypocritical approach of scraping content under the guise of fair use for its machine learning, but forbidding that its output be used to train other AI models.[8]
- ^ Perrigo, Billy (18 January 2023). "Exclusive: The $2 Per Hour Workers Who Made ChatGPT Safer". Time. Retrieved 5 August 2023.
- ^ Rowe, Niamh (2 August 2023). "'It's destroyed me completely': Kenyan moderators decry toll of training of AI models". The Guardian. Retrieved 5 August 2023.
- ^ Njanja, Annie (14 July 2023). "Workers that made ChatGPT less harmful ask lawmakers to stem alleged exploitation by Big Tech". TechCrunch. Retrieved 5 August 2023.
- ^ Harrison, Maggie (20 January 2023). "OpenAI Apparently Paid People in the Developing World $2/Hour to Read About Bestiality". Futurism. Retrieved 5 August 2023.
- ^ Hao, Karen; Seetharaman, Deepa (24 July 2023). "Cleaning Up ChatGPT Takes Heavy Toll on Human Workers". Wall Street Journal. Retrieved 5 August 2023.
- ^ Xiang, Chloe (18 January 2023). "OpenAI Used Kenyan Workers Making $2 an Hour to Filter Traumatic Content from ChatGPT". Vice. Retrieved 5 August 2023.
- ^ Ridley, Jacob (20 January 2023). "To make an AI chat bot behave, Kenyan workers say they were 'mentally scarred' by graphic text". PC Gamer. Retrieved 5 August 2023.
- ^ Barr, Alistair. "AI hypocrisy: OpenAI, Google and Anthropic won't let their data be used to train other AI models, but they use everyone else's content". Business Insider. Retrieved 26 August 2023.
- The Time article is a primary source, and should not be the sole reference. Since this is a controversy, it should be well-sourced.
- "Without pyschosocial support" is literal verbatim from the subsequent workers petition, as listed in the references.
- In the petition, the former workers of Sama say training the ChatGPT model involved reading and viewing material that depicted sexual and graphic violence, and categorising it accordingly, so that the AI could learn it for safety purposes in its future interactions with people. "Throughout the contract of training ChatGPT, we were not afforded psychosocial support. Due to the exposure to this kind of work, training ChatGPT, we have developed severe mental illnesses, including PTSD, paranoia, depression, anxiety, insomnia, sexual dysfunction, to mention a few,” the petition reads. [1] [2]
- The proposed text is excessively verbose. I was unable to integrate the summation into the garrulous text.
- Excluding a viewpoint by labeling the reference as not "dispassionate" is not in line with WP:NPOV. The premise of the criticism by the source is straightforward: OpenAI may consume any data under fair use, but nobody can use OpenAI output for the same purpose, i.e. wikt:hypocrite.
GobsPint (talk) 17:06, 2 September 2023 (UTC)
- I think the "wellness sessions" probably count as a form of "psychosocial support", and that it's not clear how much OpenAI should be blamed for the wellness sessions of the employees of Sama being "unhelpful and rare due to high demands to be more productive at work". About the doctrine of fair use, accusing any entity or person of hypocrisy is something to avoid in Wikipedia in general. But the controversy around "fair use" seems notable enough to be mentioned. For my response about the Time article and the verbosity, see my comment above. Alenoach (talk) 14:45, 3 September 2023 (UTC)
OpenAI quietly deletes ban on using ChatGPT for "military and warfare": https://theintercept.com/2024/01/12/open-ai-military-ban-chatgpt/ — Preceding unsigned comment added by Midgetman433 (talk • contribs) 22:27, 13 January 2024 (UTC)
A Commons file used on this page or its Wikidata item has been nominated for deletion
The following Wikimedia Commons file used on this page or its Wikidata item has been nominated for deletion:
Participate in the deletion discussion at the nomination page. —Community Tech bot (talk) 14:37, 6 May 2022 (UTC)
Unnecessary emphasis on Elon Musk?
I think this page refers to Elon Musk somewhat gratuitously. In particular, it seems unnecessary to feature a relatively large portrait of Musk next to a classification of the article as belonging to a series related to Musk, and linking to a page with his honors and achievements. Musk was one of several co-founding donors to the OpenAI project, and no longer has any involvement with it. I think it would be appropriate to remove the photo of Musk and the link to his honors and achievements. Nickstudenski (talk) 19:34, 10 June 2022 (UTC)
- I agree and removed "Elon Musk series." Yannn11 18:30, 11 June 2022 (UTC)
- The Hindu ref for Musk in the infobox is fishy (dated 2015 with a bunch of updates); its topic is him founding a rival AI. The role of Musk stays unclear to me after reading the wiki here. He talks a lot, so he is cited a lot. Besides the money and a seat on the board, what did he actually do? He didn't have that much influence, since he left. Or did he leave because of M$'s investment? There should be more to learn about it here. MenkinAlRire 21:22, 30 April 2023 (UTC)
Greg Brockman page
Surely time for a wiki article on him. Why not? He's an important player in OpenAI and thus in AI development. https://openai.com/blog/authors/greg/ https://www.forbes.com/profile/greg-brockman/ https://csuitespotlight.com/2022/08/23/ivy-league-dropout-greg-brockman-is-leading-the-ai-revolution/ JCJC777 (talk) 13:10, 4 January 2023 (UTC)
Definitely — Preceding unsigned comment added by Skogkatt88 (talk • contribs) 01:47, 6 February 2023 (UTC)
Kellycoinguy (talk) 11:03, 8 February 2023 (UTC) Has he asked not to have a page? Surely he's notable enough by now.
- He had a page, but it was deleted twice (1st nomination, 2nd nomination). The page is now protected from creation, so only administrators can create it. There is a draft version at Draft:Greg Brockman, which was submitted for review on 15 March 2023. It may take many months before the draft gets actually reviewed. --Lambiam 17:19, 17 April 2023 (UTC)
The redirect Triton (programming language) has been listed at redirects for discussion to determine whether its use and function meets the redirect guidelines. Readers of this page are welcome to comment on this redirect at Wikipedia:Redirects for discussion/Log/2023 September 12 § Triton (programming language) until a consensus is reached. TartarTorte 23:29, 12 September 2023 (UTC)
OpenAI and Jony Ive in talks to raise $1bn from SoftBank for AI device venture
Probably too soon to include in article (per WP:RECENTISM), but something to keep an eye on: OpenAI and Jony Ive in talks to raise $1bn from SoftBank for AI device venture. -- Cl3phact0 (talk) 08:20, 28 September 2023 (UTC)
OpenAI and Open Source
The first paragraph says at the end:
"While company started under premise of open source, its foundation now effectively oppose idea of open source AI."
Nobody has confirmed whether OpenAI actually opposes or supports open-source AI, so this is misleading at best and undermines credibility.
It should be changed to something along the lines of: "While the company was initially founded as a non-profit organization for advancing artificial intelligence, it has since restructured into a capped-profit organization."
The open-source parts and how that relates to AI safety should be moved somewhere below that.
Zenulabidin2k (talk) 11:34, 19 November 2023 (UTC)
Ownership in infobox
ProcrastinatingReader and Artem.G have removed the ownership field from the infobox.
ProcrastinatingReader's justification for removing the information was "misleading / inaccurate, due to openai's complex ownership structure". OpenAI may have an unusual governance structure, but it is an undisputed fact that 49% of the company is owned by Microsoft. To me, it is a no-brainer that the ownership information be included there.
What do editors think? 20WattSphere (talk) 01:53, 28 January 2024 (UTC)
- because, as I've said, it's an oversimplified version of the facts. Here is another source that says that Microsoft doesn't "own" OpenAI [3], and here is another [4]:
However, OpenAI’s structure means it is not the entity with the greatest investment in the company that chooses what happens as it is not a publicly traded company. OpenAI’s board controls the direction of the company not principal investor Microsoft.
- So yes, Microsoft owns shares and is its principal investor, but it's a bit more nuanced than "own OpenAI". Artem.G (talk) 07:48, 28 January 2024 (UTC)
- I think I agree with removing it, and I'm not so sure the 49% figure is undisputed.
- The NYT article that was removed unambiguously says they own 49%, but the Verge article cited in the body says rumors suggest that they will own 49% once they get their investment back - even if the rumor is true, that's not the same thing as owning 49% now. I found quite a few articles saying some version of this, many attributing it to this Semafor article [5] from before the deal closed, which emphasizes that the terms could change. It seems like nobody outside OpenAI and Microsoft knows the exact terms.
- I also see how having it in the infobox would be misleading. A casual reader could easily get the idea that Microsoft only needs to win over 1 other investor to oust the CEO, which isn't true. As we learned a few months ago, only the non-profit board can do that, and the deal didn't give Microsoft board seats (even after the shakeup they just have 1 non-voting seat). OpenAI's structure breaks common assumptions about what a 49% stake means, so it should probably just be discussed in the body. Jamedeus (talk) 08:20, 28 January 2024 (UTC)
- OpenAI is a non-profit with a mission-driven purpose; therefore it cannot have any owners, and it is managed by a board of trustees.
- OpenAI gives this ownership diagram. It sounds like "OpenAI Global" is an entity that's wholly controlled by OpenAI and its board, and economically owned by Microsoft and by a holding company composed of OpenAI + its employees individually. It also seems like Microsoft's stake gives it little decision-making capability, which is wholly controlled by the OpenAI board. There's also a clause in their agreement that more-or-less ends their partnership when OpenAI's board determines they've reached AGI. And the recent events where Microsoft was uninformed of the firing of the CEO, and unable to directly reverse the decision, show the limits to their power.
- These reasons just compound why it's too misleading to say it's owned 49% by Microsoft, but the fundamental reason really is that, as a non-profit, OpenAI does not have shares and cannot have any owners. ProcrastinatingReader (talk) 13:45, 28 January 2024 (UTC)
- Okay, I think you are right to remove it.
- Ironically, I am a perfect demonstration of your point, because my belief they were clearly 49% owned by Microsoft was probably gained from this very infobox. Apologies! 20WattSphere (talk) 09:00, 29 January 2024 (UTC)
"OpenAI." listed at Redirects for discussion
The redirect OpenAI. has been listed at redirects for discussion to determine whether its use and function meets the redirect guidelines. Readers of this page are welcome to comment on this redirect at Wikipedia:Redirects for discussion/Log/2023 November 6 § OpenAI. until a consensus is reached. Gonnym (talk) 12:16, 6 November 2023 (UTC)
Wiki Education assignment: Research Process and Methodology - FA23 - Sect 202 - Thu
This article was the subject of a Wiki Education Foundation-supported course assignment, between 6 September 2023 and 14 December 2023. Further details are available on the course page. Student editor(s): CORNELIAST (article contribs).
— Assignment last updated by CORNELIAST (talk) 18:55, 16 November 2023 (UTC)
Tasha McCauley page
Wanna help with Draft:Tasha_McCauley? Warrants an article? RudolfoMD (talk) 03:34, 20 December 2023 (UTC)
headquarters (2023)
The infobox (image caption) says the Pioneer Building is the "former headquarters". The body says it is still the current headquarters. Can we get these in sync? Which is correct? Thanks, skakEL 16:29, 25 November 2023 (UTC)
- I'm gonna remove "former" from the image caption. If someone has the facts & wants to change it back, cool. But if you do, please change it in the body of the article too. Thanks. skakEL 15:03, 27 November 2023 (UTC)
Kara Swisher says she has a scoop
I am not a good enough Wikipedian to know off the cuff whether these tweets are noteworthy or sufficiently reliable in her voice for inclusion, but my longstanding impression is that a lot of Silicon Valley goes to Kara Swisher when they want to leak, and her leak reporting is nearly flawless, unlike her op/eds. So I offer these excerpts for consideration by those more steeped in such questions: "it was a 'misalignment' of the profit versus nonprofit adherents at the company. The developer day was an issue.... Sources tell me that the profit direction of the company under Altman and the speed of development, which could be seen as too risky, and the nonprofit side dedicated to more safety and caution were at odds. One person on the Sam side called it a “coup,” while another said it was the the right move." Sandizer (talk) 03:11, 18 November 2023 (UTC)
Nomination for deletion of Template:OpenAI
Template:OpenAI has been nominated for deletion. You are invited to comment on the discussion at the entry on the Templates for discussion page. InfiniteNexus (talk) 00:15, 11 December 2023 (UTC)