Taiwan Web Infrastructure targeted by APT UAT-7237 with custom toolset

APT group UAT-7237, linked to UAT-5918, targets web infrastructure in Taiwan using customized open-source tools to maintain long-term access.

A Chinese-speaking advanced persistent threat (APT) group, tracked as UAT-7237, has been observed targeting web infrastructure entities in Taiwan using customized versions of open-source tools, with the aim of establishing long-term access within high-value victim environments.

UAT-7237 has been active since at least 2022. Cisco Talos researchers found significant overlaps with UAT-5918, an info-stealing threat actor active since 2023 that is known for using web shells and open-source tools for persistence and credential theft. Talos experts believe that UAT-7237 is a subgroup of UAT-5918.

“UAT-7237 conducted a recent intrusion targeting web infrastructure entities within Taiwan and relies heavily on the use of open-sourced tooling, customized to a certain degree, likely to evade detection and conduct malicious activities within the compromised enterprise.” reads the report published by Talos.

“UAT-7237 aims to establish long-term persistence in high-value victim environments.”

Talos researchers observed the UAT-7237 APT group using a customized shellcode loader tracked as “SoundBill,” which can decode and load arbitrary shellcode, including Cobalt Strike payloads.

UAT-7237 exploits unpatched servers for initial access, then performs rapid reconnaissance using commands like nslookup, systeminfo, and ping before establishing persistence via SoftEther VPN and RDP rather than web shells.
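
For illustration only (this script does not come from the Talos report), the sketch below wraps the three discovery commands named above in a short Python script so their output can be reviewed in one place; the hostname and IP address are placeholders, not indicators from the intrusion.

```python
# Illustrative sketch only: the discovery commands cited in the report
# (systeminfo, nslookup, ping), wrapped so their output can be captured.
# The hostname and IP address below are placeholders, not real indicators.
import subprocess

RECON_COMMANDS = [
    ["systeminfo"],                    # OS version, patch level, domain membership
    ["nslookup", "example.internal"],  # placeholder name: DNS / domain discovery
    ["ping", "-n", "1", "192.0.2.1"],  # placeholder address (Windows ping syntax)
]

for cmd in RECON_COMMANDS:
    result = subprocess.run(cmd, capture_output=True, text=True)
    print(f"$ {' '.join(cmd)}")
    print(result.stdout or result.stderr)
```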

They move laterally through networks using SMB shares and enumerate domain administrators and domain controllers. They also use WMI-based tools such as SharpWMI and WMICmd to run commands, gather system information, and prepare for further attacks.

After compromising systems, UAT-7237 deploys custom and open-source tools to maintain access and steal data. Its custom loader, SoundBill, decodes and executes shellcode from files such as ptiti.txt, running payloads ranging from Mimikatz to Cobalt Strike for credential theft and long-term access. SoundBill also embeds two programs from QQ, a Chinese instant-messaging application, likely used as decoys in phishing attacks.

They also use JuicyPotato for privilege escalation and modify Windows security settings, such as disabling UAC and enabling cleartext password storage.
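
For defenders, settings of this kind can be audited. The following is a minimal, hypothetical audit sketch (not taken from the Talos report): it assumes the changes map to the commonly abused EnableLUA (UAC) and WDigest UseLogonCredential registry values and simply flags values consistent with that tampering.

```python
# Hypothetical audit sketch (assumptions, not Talos guidance): query two Windows
# registry values commonly abused to disable UAC and to enable cleartext
# (WDigest) password storage, and flag values consistent with tampering.
import subprocess

CHECKS = [
    # (setting, registry key, value name, value suggesting tampering)
    ("UAC (EnableLUA)",
     r"HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System",
     "EnableLUA", "0x0"),           # 0 = UAC disabled
    ("WDigest cleartext storage (UseLogonCredential)",
     r"HKLM\SYSTEM\CurrentControlSet\Control\SecurityProviders\WDigest",
     "UseLogonCredential", "0x1"),  # 1 = plaintext credentials kept in memory
]

def query(key: str, value: str) -> str:
    """Return raw `reg query` output for one value, or '' if absent/unavailable."""
    try:
        result = subprocess.run(["reg", "query", key, "/v", value],
                                capture_output=True, text=True, check=True)
        return result.stdout
    except (subprocess.CalledProcessError, FileNotFoundError):
        return ""

for name, key, value, suspicious in CHECKS:
    output = query(key, value)
    if suspicious in output:
        print(f"[!] possible tampering: {name} is set to {suspicious}")
    else:
        print(f"[ok] {name}")
```

Standard user rights are sufficient for these HKLM reads; on non-Windows hosts the script reports every check as clean because the reg utility is unavailable.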

Credentials are primarily harvested with Mimikatz, sometimes embedded in SoundBill, and through LSASS dumping (Project1.exe) or registry searches for VNC credentials. Extracted data is compressed for exfiltration, and the stolen credentials allow the attackers to pivot, escalate privileges, and maintain persistence.

The threat actor spreads across networks using tools such as FScan and SMB scans to find accessible systems. They pivot using stolen credentials and maintain long-term access via SoftEther VPN; the VPN client was configured in Simplified Chinese, indicating the operators' proficiency in the language. The VPN setup was active from September 2022 to December 2024, showing extended use.

Talos published IOCs for this research on GitHub.

Follow me on Twitter: @securityaffairs and Facebook and Mastodon

Pierluigi Paganini

(SecurityAffairs – hacking, China)


