<?xml version="1.0" encoding="utf-8"?>
<feed xmlns="http://www.w3.org/2005/Atom">
  <title>Seungeon Lee</title>
  <subtitle>Seungeon Lee's portfolio site — AI-driven medium-sized peptide drug discovery</subtitle>
  <link href="https://flansma.github.io/feed.xml" rel="self"/>
  <link href="https://flansma.github.io/"/>
  <updated>2026-03-09T00:00:00+09:00</updated>
  <id>https://flansma.github.io/</id>
  <author>
    <name>Seungeon Lee</name>
    <email>lee.seungeon.62y@st.kyoto-u.ac.jp</email>
  </author>
  
  <entry>
    <title>ゼロからわかるChatGPT (Understanding ChatGPT from Scratch) — Practical AI Adoption for Healthcare Teams</title>
    <link href="https://flansma.github.io/talks/2026-03-09-yokawa-chatgpt/"/>
    <id>https://flansma.github.io/talks/2026-03-09-yokawa-chatgpt/</id>
    <updated>2026-03-09T00:00:00+09:00</updated>
    <published>2026-03-09T00:00:00+09:00</published>
    <category term="talks"/>
    <summary>Invited lecture at Yokawa Hospital (Hyogo) for approximately 60–70 staff, covering practical LLM adoption for healthcare workflows, Custom GPTs, and information security.</summary>
    <content type="html">&lt;h2 id=&quot;overview&quot;&gt;Overview&lt;/h2&gt;

&lt;p&gt;Invited lecture at Yokawa Hospital (医療法人社団 敬命会 吉川病院) in Miki, Hyogo, for approximately 60–70 staff members. Systematically covered the full scope of AI adoption for healthcare organizations: from LLM fundamentals and prompt engineering to department-level workflow integration, organizational deployment via Custom GPTs, and information security policy design. Delivered in Japanese.&lt;/p&gt;

&lt;h2 id=&quot;topics&quot;&gt;Topics&lt;/h2&gt;

&lt;ul&gt;
  &lt;li&gt;LLM operating principles and output characteristics (hallucination, context window constraints)&lt;/li&gt;
  &lt;li&gt;Prompt engineering: role assignment, output format specification, structured constraints&lt;/li&gt;
  &lt;li&gt;Department-level workflow integration: administration (automated minutes formatting, email drafting, internal notice generation), nursing/clinical (simplifying patient IC materials, English literature summarization, conference abstract structuring), education/planning (training material creation, multilingual support)&lt;/li&gt;
  &lt;li&gt;Live demo: multi-tone generation of meeting minutes and internal announcements&lt;/li&gt;
  &lt;li&gt;Mapping AI application points across the full operational workflow (phased adoption approach)&lt;/li&gt;
  &lt;li&gt;Context management and knowledge base construction via Projects&lt;/li&gt;
  &lt;li&gt;Quality standardization across the organization through Custom GPTs design and distribution&lt;/li&gt;
  &lt;li&gt;Information security: personal/sensitive data handling policies, data control settings, output approval workflows&lt;/li&gt;
&lt;/ul&gt;

&lt;h2 id=&quot;qa&quot;&gt;Q&amp;amp;A&lt;/h2&gt;

&lt;p&gt;The Q&amp;amp;A session also covered the application of other AI tools, such as CLOVA Note, in healthcare settings.&lt;/p&gt;
</content>
  </entry>
  
  <entry>
    <title>Empowering federated learning for robust compound–protein interaction prediction across heterogeneous cross-pharma domains</title>
    <link href="https://flansma.github.io/publications/2026-01-10-cpi-federated/"/>
    <id>https://flansma.github.io/publications/2026-01-10-cpi-federated/</id>
    <updated>2026-01-10T00:00:00+09:00</updated>
    <published>2026-01-10T00:00:00+09:00</published>
    <category term="publications"/>
    <summary>Empowering federated learning for robust compound–protein interaction prediction across heterogeneous cross-pharma domains — Journal of Cheminformatics</summary>
    <content type="html">
</content>
  </entry>
  
  <entry>
    <title>HELM-BERT: A Transformer for Medium-sized Peptide Property Prediction</title>
    <link href="https://flansma.github.io/publications/2025-12-30-helm-bert-jcim-submitted/"/>
    <id>https://flansma.github.io/publications/2025-12-30-helm-bert-jcim-submitted/</id>
    <updated>2025-12-30T00:00:00+09:00</updated>
    <published>2025-12-30T00:00:00+09:00</published>
    <category term="publications"/>
    <summary>HELM-BERT: A Transformer for Medium-sized Peptide Property Prediction — Journal of Chemical Information and Modeling (JCIM)</summary>
    <content type="html">
</content>
  </entry>
  
  <entry>
    <title>HELM-BERT: A Transformer for Medium-sized Peptide Property Prediction</title>
    <link href="https://flansma.github.io/preprints/2025-12-29-helm-bert/"/>
    <id>https://flansma.github.io/preprints/2025-12-29-helm-bert/</id>
    <updated>2025-12-29T00:00:00+09:00</updated>
    <published>2025-12-29T00:00:00+09:00</published>
    <category term="preprints"/>
    <summary>HELM-BERT: A Transformer for Medium-sized Peptide Property Prediction — arXiv</summary>
    <content type="html">
</content>
  </entry>
  
  <entry>
    <title>HELM-BERT: A Transformer for Medium-sized Peptide Property Prediction</title>
    <link href="https://flansma.github.io/talks/2025-10-29-helm-bert-cbi/"/>
    <id>https://flansma.github.io/talks/2025-10-29-helm-bert-cbi/</id>
    <updated>2025-10-29T00:00:00+09:00</updated>
    <published>2025-10-29T00:00:00+09:00</published>
    <category term="talks"/>
    <summary>Award-winning oral presentation at CBI 2025 introducing HELM-BERT, a transformer for medium-sized peptide property prediction using HELM notation.</summary>
    <content type="html">&lt;h2 id=&quot;overview&quot;&gt;Overview&lt;/h2&gt;

&lt;p&gt;Oral presentation at the CBI Society Annual Meeting 2025 on HELM-BERT, a transformer-based language model designed for medium-sized peptide property prediction using HELM notation. This presentation received the &lt;strong&gt;Oral Presentation Award&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Session:&lt;/strong&gt; O03-02&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Authors:&lt;/strong&gt; Seungeon Lee, Takuto Koyama, Itsuki Maeda, Atsuyuki Matsumoto, Yasushi Okuno&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Affiliation:&lt;/strong&gt; Kyoto University Graduate School of Medicine, Department of Biomedical Data Intelligence&lt;/p&gt;
</content>
  </entry>
  
  <entry>
    <title>HELM-BERT: Peptide Language Model</title>
    <link href="https://flansma.github.io/projects/helm-bert/"/>
    <id>https://flansma.github.io/projects/helm-bert/</id>
    <updated>2025-01-01T00:00:00+09:00</updated>
    <published>2025-01-01T00:00:00+09:00</published>
    <category term="projects"/>
    <summary>Transformer-based language model for medium-sized peptide property prediction using HELM notation.</summary>
    <content type="html">&lt;h2 id=&quot;overview&quot;&gt;Overview&lt;/h2&gt;

&lt;p&gt;HELM-BERT is a transformer-based language model designed specifically for medium-sized peptides represented in HELM (Hierarchical Editing Language for Macromolecules) notation.&lt;/p&gt;

&lt;h2 id=&quot;key-features&quot;&gt;Key Features&lt;/h2&gt;

&lt;ul&gt;
  &lt;li&gt;Pre-trained on large-scale peptide databases&lt;/li&gt;
  &lt;li&gt;Fine-tunable for various property prediction tasks&lt;/li&gt;
  &lt;li&gt;Supports non-natural amino acids and modifications&lt;/li&gt;
&lt;/ul&gt;

&lt;h2 id=&quot;results&quot;&gt;Results&lt;/h2&gt;

&lt;p&gt;The model achieves state-of-the-art performance on peptide property prediction benchmarks.&lt;/p&gt;
</content>
  </entry>
  
</feed>
