Caroline Lemieux

@cestlemieux.bsky.social

now: Assistant Professing™ in Software Practices Lab at UBC. was: postdoc MSR NYC, phd UC Berkeley. also at https://mastodon.acm.org/@cestlemieux. she/her.

392 Followers 99 Following 12 Posts Joined Nov 2024
4 months ago
Grad admission virtual info session. Wed Dec 3, 2025. Session 1: 8:00AM - 9:30 AM PT; Session 2 5:00PM - 6:30PM PT. Register Now!

Curious about graduate programs at UBC Computer Science? Join our virtual Q&A sessions about admissions to our MSc, PhD-track, and PhD programs on Wed Dec 3! Register here: www.cs.ubc.ca/graduate-pro...

4 months ago

Thank you!

4 months ago

I am extremely honoured to be appointed as a new Canada Research Chair, joining many esteemed colleagues at UBC and in the country at large!

4 months ago

We are recruiting for two assistant professor positions, with priority areas in: visualization; robotics & reinforcement learning; data management & data mining. Applications due December 10th! See more at the link below.

5 months ago
UBC Computer Science makes waves at programming language conference ICFP/SPLASH

The UBC Software Practices Lab is heading to #icfpsplash25! 4 ICFP/OOPSLA talks, 1 SPLASH-E, 5 talks at associated workshops... check it out: www.cs.ubc.ca/news/2025/10...

8 months ago
Release Pynguin 0.41.0 · se2p/pynguin Fix subject_properties aren't registered when running Pynguin on an imported module Update documentation (Codestyle, Code Overview) Add LLM-Agent guidelines Add PynguinML mode: Parsing and test gen...

CodaMOSA was built on Pynguin version 0.18; since then, Pynguin has improved substantially. Pynguin version 0.41.0 includes the LLMMOSA algorithm, which integrates core parts of CodaMOSA into modern Pynguin! Many kudos to the Pynguin contributors for this. github.com/se2p/pynguin...

9 months ago

A common follow-up, whenever we bring up parametric generators, is: "but can't those byte-level mutations result in a totally different generated input?"

...they do, but our experiments found that those more destructive mutations generally lead to higher coverage. Our students explored this fully in:
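The byte-level-mutation point can be made concrete with a toy parametric (choice-sequence) generator. This is an illustrative sketch only, not code from any of the papers above: the function `gen_expr` and its byte encoding are invented for the example. Each byte drives one structural choice, so mutating an early byte can reshape everything generated after it.

```python
def gen_expr(data, pos=0, depth=0):
    """Toy parametric generator: interpret raw bytes as a sequence of
    choices that build an arithmetic-expression string.

    Returns (expression, next_unread_position)."""
    if pos >= len(data) or depth > 3:
        return "1", pos  # out of bytes (or too deep): default leaf
    choice = data[pos] % 3
    if choice == 0:
        # leaf: a single digit derived from the following byte
        digit = str(data[pos + 1] % 10) if pos + 1 < len(data) else "0"
        return digit, pos + 2
    op = "+" if choice == 1 else "*"
    # internal node: recursively consume bytes for both operands
    left, p = gen_expr(data, pos + 1, depth + 1)
    right, p = gen_expr(data, p, depth + 1)
    return f"({left}{op}{right})", p

# A one-byte mutation at the front changes the generated structure:
print(gen_expr(bytes([1, 0, 5, 0, 7]))[0])  # (5+7)
print(gen_expr(bytes([2, 0, 5, 0, 7]))[0])  # (5*7)
```

Flipping the first byte swaps the operator here; flipping a byte that selects leaf-vs-node would change the whole tree shape, which is exactly the "destructive" mutation behavior discussed above.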

9 months ago

after years of fuzzing libxml2 I am happy to announce I have now actually used xmllint

9 months ago

I was today years old when I learned that IEEEtran has special figure captions that \usepackage{subcaption} overrides; instead, one should use \usepackage[caption=false]{subfig}. tex.stackexchange.com/questions/30...
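A minimal sketch of that workaround, assuming the IEEEtran conference class and two placeholder image files (`a.pdf`, `b.pdf` are made up for the example):

```latex
\documentclass[conference]{IEEEtran}
% subcaption clashes with IEEEtran's own caption setup;
% subfig with caption=false leaves IEEEtran's captions alone
\usepackage[caption=false]{subfig}
\usepackage{graphicx}

\begin{document}
\begin{figure}
  \centering
  \subfloat[First panel.]{\includegraphics[width=0.45\linewidth]{a.pdf}}
  \hfill
  \subfloat[Second panel.]{\includegraphics[width=0.45\linewidth]{b.pdf}}
  \caption{Two subfigures under IEEEtran.}
\end{figure}
\end{document}
```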

10 months ago
Post image

Back from #ICSE25! I'm looking forward to reading "No Harness, No Problem: Oracle-guided Harnessing for Auto-generating C API Fuzzing Harnesses" by @gabriel-sherman.bsky.social and @snagycs.bsky.social (users.cs.utah.edu/~snagy/paper...). Nice progress in fuzz driver generation!

11 months ago
under the header "projects", a logo consisting of a black shadow of a duck on the left hand side of the words ": QuAC". Where the duck and the text ": QuAC" intersect, the shadow is whited-out.

Finally added a little logo for QuAC (our attribute-based python type inference tool, paper: doi.org/10.1145/3689... + repo: github.com/jifengwu2k/q...) to my website :).

11 months ago
FUZZING'25 Workshop @ ISSTA The 4th International Fuzzing Workshop (FUZZING) 2025 welcomes all researchers, scientists, engineers and practitioners to present their latest research findings, empirical analyses, t...

There's still time to submit to FUZZING'25! This year, we're accepting both the (now classic) registered reports _and_ new short papers (fuzzing nuggets). Deadline is now March 26th! fuzzingworkshop.github.io

1 year ago

Was already on my reading list, gives me more incentive to read it :P

1 year ago
Post image

🧠 Older models memorize more: models like CodeGen and CodeLlama show significantly higher leakage on Defects4J than newer models (e.g., Llama 3.1). They often reproduce patches verbatim, to the point that it's weird (including comments!!) 🔥 3/
