
In a model with >=30k neurons total.
This applies to any general programming language, but Python is in the title for ease of understanding the question.
🏅 Top traders
# | Name | Total profit
---|---|---
1 | | Ṁ161
2 | | Ṁ124
3 | | Ṁ36
4 | | Ṁ22
5 | | Ṁ15
@AnishaZaveri what do you mean? GPT-4 already produces well-indented Python code, doesn't it? And where is the "knowledge" inside the model?
@firstuserhere Sorry, I got confused. I thought it was asking whether someone would design a transformer model to predict indentation in Python code.
@AnishaZaveri No worries. Actually, that could be a good experiment too: instead of next-token prediction, train a model on next-indentation prediction. Try to see what separates ordinary English whitespace from indentation in code, find which neurons fire specifically for code, identify that circuit, and see whether the same thing happens in an NLP transformer.
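To make the next-indentation-prediction idea concrete, here is a minimal sketch of how one might build training pairs from Python source: the label for each example is the indentation depth of the following line rather than the next token. All function names here are hypothetical, and tabs-as-4-spaces is an assumption, not anything from the thread.

```python
# Build (context, next-line indent) training pairs from Python source.
# Hypothetical sketch: the prediction target is the indentation depth
# of the next non-blank line, instead of the next token.

def indent_of(line: str) -> int:
    """Count leading spaces (tabs expanded to 4 spaces, an assumption)."""
    expanded = line.expandtabs(4)
    return len(expanded) - len(expanded.lstrip(" "))

def indentation_pairs(source: str):
    """Yield (context_lines, next_line_indent) examples, skipping blanks."""
    lines = [l for l in source.splitlines() if l.strip()]
    for i in range(1, len(lines)):
        yield lines[:i], indent_of(lines[i])

example = """\
def f(x):
    if x > 0:
        return x
    return -x
"""

for ctx, indent in indentation_pairs(example):
    print(len(ctx), indent)
# → 1 4
# → 2 8
# → 3 4
```

A trained model would then consume the tokenized context and emit an indentation depth (or a dedent/indent/same decision), which makes it easy to probe which neurons activate on code structure versus ordinary English spacing.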