
> language models can't make GPU chips

Not yet. I'm sure somebody is taking a hard look already.



I would assume some of them can write Verilog code


Well, I clearly had to try it even though I don't know my Verilog from my veri-println. Can someone who does know rate this?

--

Prompt: verilog code for 4 bit binary adder

[I'm skipping the English description of the code; this is just the code part of free ChatGPT's response, no cherry-picking]

  module binary_adder(A, B, Cin, S, Cout);
    input [3:0] A, B;
    input Cin;
    output [3:0] S;
    output Cout;
    
    wire C0, C1, C2;
    
    // Full adder for bit 0
    full_adder FA0(A[0], B[0], Cin, S[0], C0);
    
    // Full adder for bit 1
    full_adder FA1(A[1], B[1], C0, S[1], C1);
    
    // Full adder for bit 2
    full_adder FA2(A[2], B[2], C1, S[2], C2);
    
    // Full adder for bit 3
    full_adder FA3(A[3], B[3], C2, S[3], Cout);
    
  endmodule
  
  module full_adder(A, B, Cin, S, Cout);
    input A, B, Cin;
    output S, Cout;
    
    assign S = A ^ B ^ Cin;
    assign Cout = (A & B) | (A & Cin) | (B & Cin);
    
  endmodule
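
For anyone who wants to check it without knowing the theory: here's a minimal self-checking testbench sketch that exhaustively compares the module against A + B + Cin for all 512 input combinations. This is mine, not part of the ChatGPT response, and it assumes the module and port names above.

  module binary_adder_tb;
    reg [3:0] A, B;
    reg Cin;
    wire [3:0] S;
    wire Cout;
    integer a, b, c, errors;
    
    // device under test: the binary_adder module from the ChatGPT answer above
    binary_adder dut(A, B, Cin, S, Cout);
    
    initial begin
      errors = 0;
      for (a = 0; a < 16; a = a + 1)
        for (b = 0; b < 16; b = b + 1)
          for (c = 0; c < 2; c = c + 1) begin
            A = a; B = b; Cin = c;
            #1; // let the combinational logic settle
            if ({Cout, S} !== a + b + c) begin
              errors = errors + 1;
              $display("FAIL: %0d + %0d + %0d -> %0d", a, b, c, {Cout, S});
            end
          end
      if (errors == 0)
        $display("All 512 cases passed");
      $finish;
    end
  endmodule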


It seems correct, but verbose. GPT-4 gives a more concise and idiomatic version:

  module full_adder(input [3:0] a, input [3:0] b, input cin, output reg [3:0] sum, output reg cout);

  always @(*) begin
      {cout, sum} = a + b + cin;
  end

  endmodule
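
Worth noting that the module is misnamed: a full adder is the 1-bit building block (like the one in the structural version above), while this is a 4-bit adder. It works because the {cout, sum} target is 5 bits wide, so the carry out of the 4-bit addition lands in cout. The always block and the reg outputs aren't strictly needed either; a continuous assign does the same job. A sketch of that (my naming, not GPT-4 output):

  module adder4(input [3:0] a, input [3:0] b, input cin,
                output [3:0] sum, output cout);
    // {cout, sum} is 5 bits wide, so the carry out of the
    // 4-bit addition ends up in cout
    assign {cout, sum} = a + b + cin;
  endmodule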


We found that Codex was pretty bad at Verilog, so we fine-tuned our own models on Verilog code, starting from the Salesforce CodeGen model. The paper will appear at DATE (Design, Automation and Test in Europe): https://arxiv.org/abs/2212.11140

The largest model we trained is here: https://huggingface.co/shailja/fine-tuned-codegen-16B-Verilo...



