
Lisuan Unveils G100, China's 6 nm GPU Targeting RTX 4060-Level Performance

AleksandarK

News Editor
Staff member
Lisuan Technology announced this week on its official WeChat channel that it has successfully powered on its prototype G100 graphics card. The company describes the G100 as China's first domestically designed 6 nm GPU, marking a significant milestone in its effort to challenge established industry players. With the first silicon now operational, Lisuan is moving into driver development, software validation, and broader system integration testing. Although Lisuan has provided few formal specifications, rumors indicate that SMIC, currently the only Chinese foundry capable of producing a 6 nm node under US export restrictions, is fabbing the G100 die. The same rumors describe performance on par with NVIDIA's GeForce RTX 4060 in mid-range gaming tests, alongside generous onboard memory, efficient power consumption, and support for DirectX 12, Vulkan 1.3, and OpenGL 4.6. If accurate, these features could position the G100 for both gaming and general-purpose GPU workloads.

Founded in late 2021 by a team of former Silicon Valley engineers with more than 25 years of collective chip-design experience, Lisuan Technology is among the youngest entrants in China's graphics-chip sector. It follows Biren Technology (established in 2019) and Moore Threads (established in 2020) in their pursuit of a homegrown alternative to foreign GPU offerings. Beijing's push for semiconductor self-sufficiency has encouraged such ventures, and Lisuan insists its TrueGPU architecture is fully developed in-house rather than licensed from outside sources. The G100 was initially slated for a 2023 launch but encountered financial headwinds that nearly forced Lisuan into bankruptcy in 2024. A $27.7 million capital injection from the parent company, Dongxin Semiconductor, kept development on track through tape-out and early risk-production trials. Lisuan now plans to ship small volumes of G100 cards in the third quarter of 2025, with mass availability more likely in 2026.



View at TechPowerUp Main Site | Source
 
Is that a GPU in a socket, or the usual AI-generated image?

Just an artistic representation; it looks like an Intel LGA 1200 processor.
 
It's a strange time when I'm cheering on China to grow
(and break the duopoly of power).
 
It's a strange time when I'm cheering on China to grow
(and break the duopoly of power).

We don't even have any reviews of how they do with a variety of games yet, though, so I'm very skeptical, to say the least.
 
It's a strange time when I'm cheering on China to grow
(and break the duopoly of power).
It will take ages for those things to become anywhere near useful for gaming.
Just look at Intel, which has had its toes in graphics with its iGPUs,
and Moore Threads, which is just an oversized Imagination Technologies PowerVR GPU.

So maybe in 5 years they'll have something that can play something, but not the back catalog (which is probably the only thing worth playing).
 
Just look at Intel, which has had its toes in graphics with its iGPUs

I don't know all the details, but maybe Intel hadn't been pouring a whole lot of money into GPU R&D until the last 5-10 years?

With the market the way it is now, it's a good time for a third competitor to enter. Some people might say that Intel should have entered the market a long time ago, but they probably would have been worse off back then, when the market was completely dominated by ATi and NVIDIA. Better late than never.

::EDIT::

Also, as a point of note: unless they're obtained through back channels, very few of these cards are going to make it overseas when/if released. If the product isn't just for show, then the government will want the manufacturer to serve its own market first before allowing it to be exported, or before export models are even considered, for political and non-political reasons, as China does.
 
Gotta love how naïve Western governments are, letting citizens of adversarial or potentially adversarial non-democratic countries join their cutting-edge technology firms, as if the knowledge and experience gathered there isn't going to benefit those countries, which are ready to provide mountains of subsidies.
 
Good, I hope they can do the same thing they did with EVs: bring more quality at better prices.
 
There will be virtually no gaming performance; these GPUs will be optimised only for workstations. Perhaps we'll see some sort of Apple approach, where these Chinese lads make their own GPUs to run their own software so they become more autonomous.
 
Intel iGPUs were OK; they could do most things fine.
Getting things going on the platform is most of the work.

Their iGPUs are in a much better place today, but their drivers still need a little tuning to get the most out of them.
 
So the next gen of this could hit 5060 Ti levels of performance? That might be interesting in a couple of years' time, when nGreedia is still finding new ways to screw gamers over.
 
So the next gen of this could hit 5060 Ti levels of performance? That might be interesting in a couple of years' time, when nGreedia is still finding new ways to screw gamers over.

Legitimate question: what's the obsession with the "screwing gamers over" argument? If you wanna play video games and don't care about the highest graphics settings, all you need is an RTX 2060. Heck, a 4 GB 3050 laptop is gonna do the trick. Really. Nobody's "screwing over" anybody else. Latest-generation gear is always more expensive.
 
Legitimate question: what's the obsession with the "screwing gamers over" argument? If you wanna play video games and don't care about the highest graphics settings, all you need is an RTX 2060. Heck, a 4 GB 3050 laptop is gonna do the trick. Really. Nobody's "screwing over" anybody else. Latest-generation gear is always more expensive.
Legitimate question: why does anyone need more than 1970s Atari Pong?
 
Legitimate question: why does anyone need more than 1970s Atari Pong?

False analogy. You certainly need more than an Atari 2600, but you clearly don't need an RTX 5090 to play anything out here right now. Which is why I don't really get why we need to resort to this greed argument. Can't buy one, or don't agree with the prices? Stick with what you have until the situation is untenable; it's the only way to make any of this change.
 
False analogy. You certainly need more than an Atari 2600, but you clearly don't need an RTX 5090 to play anything out here right now. Which is why I don't really get why we need to resort to this greed argument. Can't buy one, or don't agree with the prices? Stick with what you have until the situation is untenable; it's the only way to make any of this change.
You are acting as if 80- and even 70 Ti-series cards are any more affordable.

Can you game on a 60-class card? Sure.
Can you use it to drive your 4K display for gaming? Nope.
 
Can you use it to drive your 4K display for gaming? Nope.
If you decide to play in 4K Ultra then that's on you. It's still considered a premium experience that comes at premium prices. If you think NVIDIA is screwing you over, then you're actively asking them to by aiming for that level of visual quality. You could settle for 1440p or even 1080p* with medium settings and get away comfortably with a 60-class GPU. In other words, you don't need a 5090, because you don't need to play in 4K Ultra. Needing and wanting are different things.

*1080p works well with your 4K monitor: you can use integer scaling to play at 1080p with the same sharpness as if 1080p were the monitor's native resolution (because one game pixel maps to exactly 4 physical pixels).
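For anyone curious about the arithmetic behind that footnote, here's a minimal sketch of the integer-scaling check (the function name and structure are my own and purely illustrative, not from any driver API):

```python
# Hypothetical sketch: does a render resolution integer-scale onto a monitor's
# native resolution, and how many physical pixels does each game pixel occupy?

def integer_scale(native, render):
    """Return physical pixels per game pixel if integer scaling applies, else None."""
    nw, nh = native
    rw, rh = render
    if nw % rw == 0 and nh % rh == 0 and nw // rw == nh // rh:
        factor = nw // rw          # per-axis scale factor
        return factor * factor     # physical pixels per game pixel
    return None

# 1080p on a 4K panel: 3840/1920 = 2160/1080 = 2, so each game pixel fills a 2x2 block.
print(integer_scale((3840, 2160), (1920, 1080)))  # -> 4
# 1440p on a 4K panel does not divide evenly, so integer scaling does not apply.
print(integer_scale((3840, 2160), (2560, 1440)))  # -> None
```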
 
Legitimate question: what's the obsession with the "screwing gamers over" argument? If you wanna play video games and don't care about the highest graphics settings, all you need is an RTX 2060. Heck, a 4 GB 3050 laptop is gonna do the trick. Really. Nobody's "screwing over" anybody else. Latest-generation gear is always more expensive.

I think a lot of it stems from the gaming industry falling from its peak as an industry that pushed hardware forward. All the money jumped ship to follow crypto and now AI. It used to be that gamers got the best silicon outside of the Quadro/Tesla/Grid/Pro/etc. series.

But if Intel's adventures with Arc are anything to go by, these Chinese companies have a MASSIVE uphill battle even if they manage hardware parity or superiority with the competition. The difference in performance between the Arc A-series at launch and what you saw by the time the B-series launched could be staggering. Supporting all the modern games to a playable degree might be a challenge suited to AI... because it's an insane workload for a team of humans.

But also, if SMIC is still relying on DUV lithography (as EUV is under export restrictions with no homegrown alternative), then they are relying on self-aligned quadruple patterning (SAQP), essentially a brute-force way to get smaller features. It is slower (more steps), carries more risk of errors (more steps...), and is more expensive (more tools, more steps). The reason the industry moved to EUV wasn't because it was easy (it was not), but because this method hit a wall of practicality around 10 nm, even if they knew they could take it down to 5 nm or below. With octuple patterning SMIC could probably hit 3-4 nm, but you don't have a competitive production line. Comparing EUV to multi-patterned DUV is apples to oranges; you're talking about 5 steps per layer compared to 30 steps. You do this when you're stretching production to its limits, not when you're ready to churn out millions of high-end chips. It's not a dead end, it's just diminishing returns. The smaller they try to make these features, the more steps are involved, the higher the risk to yields, and the higher the production costs. EUV was a game-changer because you could get 20 nm pitch metal lines with a single exposure instead of 7 rounds of spacer/etch.
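To make the "more steps = more risk and cost" point concrete, here's a rough back-of-the-envelope sketch. The per-step yield and cost figures are made-up assumptions for illustration, not SMIC or industry data; only the compounding effect is the point:

```python
# Illustrative only: assumed per-step yield and per-step cost, NOT real fab data.
# The point is that yield falls geometrically and cost grows roughly linearly
# with the number of litho/etch steps per critical layer.

def layer_yield(steps_per_layer, per_step_yield=0.999):
    # Each extra exposure/spacer/etch step is another chance to ruin the layer.
    return per_step_yield ** steps_per_layer

def relative_cost(steps_per_layer, cost_per_step=1.0):
    # Cost scales roughly with tool time, i.e. with step count.
    return steps_per_layer * cost_per_step

for name, steps in [("EUV single exposure", 5), ("DUV multi-patterning (SAQP)", 30)]:
    print(f"{name}: ~{steps} steps/layer, "
          f"layer yield ~{layer_yield(steps):.1%}, "
          f"relative cost ~{relative_cost(steps):.0f}x")
```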
 
You are acting as if 80- and even 70 Ti-series cards are any more affordable.

Can you game on a 60-class card? Sure.
Can you use it to drive your 4K display for gaming? Nope.

If you're in the lowest-common-denominator tier, you don't use 4K. Make it make sense.
 