jc.jang
Linear Regression
Linear Regression: Hypothesis and Cost
Guess a hypothesis, then calculate its cost.
Repeat, changing the hypothesis parameters (W, b) in the direction that minimizes the cost.
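The two steps above can be sketched in plain Python. This is a minimal illustration with toy data (study hours vs. scores), not the lecture's actual code: a hypothesis H(x) = W·x + b and a mean-squared-error cost.

```python
def hypothesis(x, W, b):
    """Predicted value for input x under parameters W, b."""
    return W * x + b

def cost(xs, ys, W, b):
    """Mean squared error between predictions and targets."""
    n = len(xs)
    return sum((hypothesis(x, W, b) - y) ** 2 for x, y in zip(xs, ys)) / n

# Toy data for illustration: study hours -> exam scores.
xs = [1.0, 2.0, 3.0]
ys = [1.0, 2.0, 3.0]

print(cost(xs, ys, W=1.0, b=0.0))  # perfect fit: cost is 0.0
print(cost(xs, ys, W=0.0, b=0.0))  # worse guess: cost is larger
```

Minimizing this cost over (W, b) is exactly the "repeat and adjust" loop described above.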
Predicting a score (y) from study hours (x) is single-variable linear regression.
But in image processing, an image is a matrix of many pixel values, and multi-variable inputs are common well beyond image processing.
So we need to know how to handle multiple variables.
To handle multiple variables, we introduce matrices.
X · W = Y
Now we only need to think about the dimensions: if X holds n examples with m features each, X is (n, m), W is (m, 1), and the result Y is (n, 1).
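The dimension check can be made concrete with NumPy. The numbers below are made-up example values, only there to show the shapes:

```python
import numpy as np

# 3 examples, 3 features each -> X has shape (3, 3).
X = np.array([[73., 80., 75.],
              [93., 88., 93.],
              [89., 91., 90.]])

# One weight per feature -> W has shape (3, 1) (illustrative values, not learned).
W = np.array([[1.0], [0.5], [0.2]])

# (3, 3) @ (3, 1) -> (3, 1): one prediction per example.
Y = X @ W
print(Y.shape)  # (3, 1)
```

As long as the inner dimensions match (features of X vs. rows of W), a single matrix product computes the hypothesis for every example at once.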
A note on notation:
lecture (theory): H(x) = Wx + b
implementation (TensorFlow): H(X) = XW + b — the matrix product puts X first so the dimensions line up.
In addition, gradient descent is the useful tool for minimizing the cost:
W := W − α · ∂cost/∂W
where α is the learning rate.
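The update rule can be sketched as a loop, again in plain Python with toy data. For simplicity this assumes H(x) = W·x (b omitted) and a hand-picked learning rate α; none of these specifics come from the lecture:

```python
def grad(xs, ys, W):
    """Derivative of the mean squared error with respect to W."""
    n = len(xs)
    return sum(2 * (W * x - y) * x for x, y in zip(xs, ys)) / n

xs = [1.0, 2.0, 3.0]
ys = [1.0, 2.0, 3.0]

W, alpha = 5.0, 0.05        # bad initial guess, small learning rate
for _ in range(100):
    W -= alpha * grad(xs, ys, W)  # W := W - alpha * d(cost)/dW

print(round(W, 4))  # converges toward 1.0, the perfect-fit weight
```

Each step moves W opposite the gradient, so the cost shrinks until W settles near the minimum.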