PhD updates
BaleChen committed Jun 14, 2024
1 parent 0283fd7 commit 84a4fc9
Showing 6 changed files with 7 additions and 8 deletions.
4 changes: 2 additions & 2 deletions _config.yml
@@ -83,10 +83,10 @@ author:
name : "Bale Chen"
avatar : "avatar.png"
cn_name : "Also: 陈丙森 (Chen Bing Sen)"
bio : "PhD student at NYU \n NLP Researcher"
bio : "PhD student at NYU
NLP Researcher"
location : "Shanghai, China"
employer :
# pubmed : "https://www.ncbi.nlm.nih.gov/pubmed/?term=john+snow"
# googlescholar : "http://yourfullgooglescholarurl.com"
email : "[email protected]"
researchgate : # example: "https://www.researchgate.net/profile/yourprofile"
2 changes: 1 addition & 1 deletion _data/navigation.yml
@@ -16,7 +16,7 @@ main:
# url: /year-archive/

- title: "CV"
url: /files/CV_Bale_Chen_1215.pdf
url: /files/Bale_Chen_Simplified_PhD_CV_Jun2024.pdf

- title: "More"
url: /more/
5 changes: 3 additions & 2 deletions _pages/about.md
@@ -8,10 +8,11 @@ redirect_from:
- /about.html
---

Hi! I am an incoming CS PhD student at [NYU Courant](https://cims.nyu.edu/dynamic/) and [NYU Shanghai](https://shanghai.nyu.edu), mentored by Professor [Chen Zhao](http://www.chenz.umiacs.io/). I received my B.S. degree in Data Science (concentration in Artificial Intelligence) from NYU Shanghai, where I was supervised by Professor [Wilson Tam](https://shanghai.nyu.edu/academics/faculty/directory/yik-cheung-wilson-tam). My research interests are in **Retrieval-Augmented Language Models** and **Controllable Text Generation**.
Hi! I am an incoming CS PhD student at [NYU Courant](https://cims.nyu.edu/dynamic/) and [NYU Shanghai](https://shanghai.nyu.edu), mentored by Professor [Chen Zhao](http://www.chenz.umiacs.io/). Previously, I received my B.S. degree in Data Science (concentration in Artificial Intelligence) from NYU Shanghai, where I was supervised by Professor [Wilson Tam](https://shanghai.nyu.edu/academics/faculty/directory/yik-cheung-wilson-tam). My research interests are in **Retrieval-Augmented Language Models** and **Controllable Text Generation**.

News
======

* **[March 2024]** I am attending NAACL 2024 to present our work on [](https://arxiv.org/abs/2405.17893) in collaboration with Xiaocheng Yang and Professor Wilson Tam!
* **[March 2024]** I am attending NAACL 2024 to present our work [ProPer](https://arxiv.org/abs/2405.17893) in collaboration with Xiaocheng Yang and Professor Wilson Tam!


4 changes: 1 addition & 3 deletions _publications/20240313-prolog.md
@@ -2,9 +2,7 @@
title: "Arithmetic Reasoning with LLM: Prolog Generation & Permutation"
collection: publications
permalink: /publication/20240313-prolog
excerpt: '<p> <a href="https://arxiv.org/abs/2405.17893" style="color:#51ADC8;">Paper</a> <a href="https://github.com/yxc-cyber/ProPer" style="color:#51ADC8;">Code</a><br />Xiaocheng Yang, <b>Bingsen Chen</b>, Yik-Cheng Tam</p>'
date: 2024-03-13
venue: 'NAACL 2024'
excerpt: '<p> [<a href="https://arxiv.org/abs/2405.17893" style="color:#51ADC8;">Paper</a>] [<a href="https://github.com/yxc-cyber/ProPer" style="color:#51ADC8;">Code</a>]<br />[NAACL 2024] Xiaocheng Yang, <b>Bingsen Chen</b>, Yik-Cheng Tam</p>'
---

Instructing large language models (LLMs) to solve elementary school math problems has shown great success using Chain of Thought (CoT). However, the CoT approach relies on an LLM to generate a sequence of arithmetic calculations which can be prone to cascaded calculation errors. We hypothesize that an LLM should focus on extracting predicates and generating symbolic formulas from the math problem description so that the underlying calculation can be done via an external code interpreter. We investigate using LLM to generate Prolog programs to solve mathematical questions. Experimental results show that our Prolog-based arithmetic problem-solving outperforms CoT generation in the GSM8K benchmark across three distinct LLMs. In addition, given the insensitive ordering of predicates and symbolic formulas in Prolog, we propose to permute the ground truth predicates for more robust LLM training via data augmentation.
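The permutation-based data augmentation described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation; `permute_predicates` and its parameters are hypothetical names, and the example Prolog facts are invented.

```python
import random
from math import factorial

def permute_predicates(predicates, n_augmented=3, seed=0):
    """Create training variants of a Prolog program by reordering its
    predicates; Prolog facts and rules are insensitive to ordering, so
    each permutation encodes the same program."""
    rng = random.Random(seed)
    # There are at most len(predicates)! distinct orderings.
    n_augmented = min(n_augmented, factorial(len(predicates)))
    seen = set()
    while len(seen) < n_augmented:
        order = tuple(rng.sample(predicates, len(predicates)))
        seen.add(order)
    return ["\n".join(order) for order in seen]

# Hypothetical ground-truth program for a toy word problem.
facts = [
    "apples(5).",
    "oranges(3).",
    "total(T) :- apples(A), oranges(B), T is A + B.",
]
variants = permute_predicates(facts)
```

Each variant contains the same predicates in a different order, so the augmented set teaches the model that predicate order is immaterial while the external interpreter still computes the same answer.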
Binary file added files/Bale_Chen_Simplified_PhD_CV_Jun2024.pdf
Binary file removed files/CV_Bale_Chen_1215.pdf
