Big Talk Smart Manufacturing #1: The Hammer and the Scene

"What are you afraid of? Just tell me!" the scientist said confidently, patting his chest. "My model has billions of parameters; what could it fail to see? There is nothing to be afraid of." Little does he know, this is exactly where the problem lies. In the laboratory, data is abundant: positive and negative samples alike, tens of thousands of manually labeled categories in ImageNet, plus transfer learning and fine-tuning. Building a basic demo really is easy, and it reliably dazzles newcomers to the neural network field, who feel they have glimpsed a long-lost dawn. But on the actual factory floor, it is a different story.

"Give me complete defect data, and I'll give you the whole world," the scientist declared confidently on arriving at the site.

"You think too much; I don't have any of that," the industry engineer replied lightly. "I only have OK data. Don't overcomplicate it, just pick out the ones that look different. It's that simple. AI is supposed to be powerful, right? Can you finish in ten minutes? Bye, I'm off to chase an order."

Come back three months later and you will find two sulking people, speechless and on the verge of tears. "Tell me, how am I supposed to explain this to the boss?" "There is no data. What do you expect me to do, conjure it? You are cruel, you are heartless, you are being unreasonable!" (Repeat the above 65,536 times; we will hold off on flipping the table for now.)

The industry is not hiding the data: there is almost no defect data on site, certainly not as much as the scientists want. "Our yield rate is above 98%; otherwise we would have collapsed long ago," the engineer said with a bitter face.
Take the electronics contract-manufacturing industry: every factory squeezes raw-material prices as hard as it can, because the damned customer has threatened a 5% price cut this year, or else the orders go elsewhere. "Also, I only accept good units. Tweet me, love you," the big customer says with an innocent smile. More defects mean higher cost, so Taiwanese factories long ago learned to run at low cost and high yield, and it is precisely this that makes the scientists hit a wall.

University labs chase SOTA (state of the art), and the measure of strength is how many papers land in which journals, with each model bigger than the last. Take convolutional networks as an example. In 1998, LeNet had a little over 40,000 parameters (44,426). Less than 20 years later, in 2016, ResNet had more than 20 million, roughly a 600-fold increase; at that scale, every person in Taiwan could take home more than one parameter on average. And that was still six years ago. In 2020, OpenAI announced GPT-3 with 175 billion parameters, more than 7,000 times larger than ResNet. GPT-3 and ResNet handle different kinds of data, but the growth rate of model scale is plain to see: like Jack's beanstalk, it can shoot up to the sky overnight.
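The growth factors above can be checked with a few lines of arithmetic. The exact ResNet variant is an assumption here (ResNet-50 at roughly 25.6 million parameters; the text only says "more than 20 million"), so the ratios are approximate:

```python
# Approximate parameter counts quoted in the text.
lenet_1998 = 44_426               # LeNet (1998), figure from the text
resnet_2016 = 25_600_000          # assumed ResNet-50 (~25.6M); text says "more than 20 million"
gpt3_2020 = 175_000_000_000       # GPT-3 (2020)

# Growth factors between generations of models.
print(f"LeNet -> ResNet: {resnet_2016 / lenet_1998:,.0f}x")   # on the order of the "600x" in the text
print(f"ResNet -> GPT-3: {gpt3_2020 / resnet_2016:,.0f}x")    # on the order of the "7,000x" in the text
```

Under these assumptions the two ratios come out near 576x and 6,800x, consistent with the rounded "600 times" and "7,000 times" figures in the text.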