Will a transformer circuit be found for predicting the correct indentation level for a new line in Python this year?
Resolved NO (Jan 1)

In a model with >=30k neurons total.

This applies to any general-purpose programming language; Python is in the title for ease of understanding the question.

🏅 Top traders (by total profit)

1. Ṁ161
2. Ṁ124
3. Ṁ36
4. Ṁ22
5. Ṁ15

Is anyone looking for it? This doesn't seem that hard (e.g., compare to looking up the capital of Italy), but it seems unlikely to be a priority for circuit searchers in 2023.

@JacyAnthis yes, @DanValentine mentioned he was looking for it.

Why do you need a transformer for this?

@AnishaZaveri what do you mean? GPT-4 produces correct indentation when it outputs Python code, doesn't it? The question is where that "knowledge" lives inside the model.

@firstuserhere Sorry, I got confused. I thought it meant: will someone design a transformer model to predict indentation in Python code?

@AnishaZaveri no worries. But actually that could be a good experiment too: instead of next-token prediction, train a model on next-indentation prediction. Then try to see what separates ordinary English word spaces from indentation in code, which neurons fire specifically for code, whether that circuit can be identified, and whether something similar appears in a general NLP transformer.
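As a rough sketch of the inspection step of that experiment (not something anyone in the thread actually ran): assuming the TransformerLens library and GPT-2 small, which has roughly 37k MLP neurons and so meets the >=30k criterion above, one could compare MLP neuron activations on indentation tokens in Python code against ordinary word-separating spaces in English prose. The layer choice, example texts, and token filters below are arbitrary illustrations, not a known circuit.

```python
# Sketch: compare MLP neuron activations on code indentation vs. prose spaces.
# Assumes `pip install transformer_lens`; layer and texts are arbitrary choices.
import torch
from transformer_lens import HookedTransformer

model = HookedTransformer.from_pretrained("gpt2")  # 12 layers * 3072 ≈ 37k MLP neurons
LAYER = 6  # arbitrary layer to inspect

code_text = "def f(x):\n    if x > 0:\n        return x\n    return -x\n"
prose_text = "The quick brown fox jumps over the lazy dog near the river bank."

def mean_mlp_activation(text: str, keep_token) -> torch.Tensor:
    """Mean post-nonlinearity MLP activation over tokens selected by keep_token."""
    tokens = model.to_tokens(text)
    str_tokens = model.to_str_tokens(text)
    _, cache = model.run_with_cache(tokens)
    acts = cache["post", LAYER][0]                     # shape: [seq_len, d_mlp]
    mask = torch.tensor([keep_token(t) for t in str_tokens])
    return acts[mask].mean(dim=0)                      # shape: [d_mlp]

# Indentation tokens: whitespace-only tokens (runs of spaces) inside code.
indent_mean = mean_mlp_activation(code_text, lambda t: t.strip() == "" and " " in t)
# Ordinary word separators: prose tokens that merely start with a space.
space_mean = mean_mlp_activation(prose_text, lambda t: t.startswith(" ") and t.strip() != "")

# Neurons that respond much more to indentation than to ordinary spaces are
# candidates for a "code indentation" circuit worth inspecting further.
diff = indent_mean - space_mean
top = torch.topk(diff, k=10)
for idx, val in zip(top.indices.tolist(), top.values.tolist()):
    print(f"layer {LAYER} neuron {idx}: indentation minus prose activation = {val:.3f}")
```

This only surfaces candidate neurons; actually establishing a circuit would still require the usual follow-up, e.g. ablating or patching those neurons and checking whether indentation predictions degrade.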
