Table of Contents
- Monocular Camera Calibration (with Python OpenCV)
  - 1. Loose Ends from the Previous Post
  - 2. Monocular Camera Calibration
    - 2.1 Data Collection
    - 2.2 Corner Extraction
    - 2.3 Parameter Estimation
    - 2.4 Parameter Evaluation (Reprojection Error)
    - 2.5 Camera Pose (Chessboard Pose) Visualization
    - 2.6 Comparison with Matlab Calibration Results
Monocular Camera Calibration (with Python OpenCV)
1. Loose Ends from the Previous Post
Before getting into this post, let me revisit a hole left by the previous post, [Computer Vision] Panoramic Image Stitching with ORB Corners + RANSAC (not really filling it — more of a record, since the problem isn't actually solved yet; I'm leaving it here in case I find time later). Feel free to skip straight to the next section.
First, how to stitch more than two images: the previous post could only stitch a pair, and handling more images meant saving the intermediate stitching result and re-extracting its corners from scratch.
The improved approach computes only the homography M_current between the current image pair, then chains it with the previously accumulated homography M_before; the product M carries the accumulated transform forward to the current pair:
```python
# No previous transform at the start, so initialize with the identity matrix
M_before = np.eye(3)
result = cv2.imread('datas/1.jpg')
# result = CalcUndistort(result, mtx, dist)
result, _, _ = utils.auto_reshape(result, 1080)
img2 = result
cors2, desc2 = extraORBfromImg(orb, img2)

for i in range(1, 6):
    print(i)
    img1 = cv2.imread('datas/' + str(i + 1) + '.jpg')
    # img1 = CalcUndistort(img1, mtx, dist)
    img1, _, _ = utils.auto_reshape(img1, 1080)
    cors1, desc1 = extraORBfromImg(orb, img1)
    match_dist, match_idx = ORBMatch(match, desc1, desc2)
    # Coordinates of the matched point pairs
    match_pts = findMatchCord(match_idx, cors1, cors2)
    # Visualize the matches
    # utils.drawMatches(img1, img2, cors1, cors2, match_idx)
    # RANSAC iterations to reject outlier pairs
    update_match_pts = RANSAC(match_pts)
    # Estimate the homography by least squares
    M = calc_homography(update_match_pts)
    # Visualize the stitching result, passing in the accumulated homography M
    # and receiving the accumulated stitching result so far
    result, M = homography_trans(M, M_before, img1, result)
    M_before = M
    # No need to re-extract features for the next iteration:
    img2 = img1
    cors2, desc2 = cors1, desc1
```
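The helpers used above (extraORBfromImg, ORBMatch, RANSAC, calc_homography, etc.) come from the previous post. For reference, here is a minimal DLT (direct linear transform) sketch of what calc_homography might compute from the inlier pairs — an assumed reconstruction, not the post's actual implementation:

```python
import numpy as np

def dlt_homography(pts):
    """Estimate a 3x3 homography from point pairs via the DLT + SVD.
    pts: (N, 4) array of [x1, y1, x2, y2] correspondences, N >= 4.
    (A sketch of what calc_homography might do; the post's version may differ.)"""
    A = []
    for x1, y1, x2, y2 in pts:
        # Each correspondence contributes two rows of the constraint matrix
        A.append([-x1, -y1, -1, 0, 0, 0, x2 * x1, x2 * y1, x2])
        A.append([0, 0, 0, -x1, -y1, -1, y2 * x1, y2 * y1, y2])
    # The homography is the right singular vector of the smallest singular value
    _, _, Vt = np.linalg.svd(np.array(A, dtype=np.float64))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]
```

With exact correspondences, the null-space vector recovers the homography up to scale, which is why the result is normalized by H[2, 2].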
Note that the code stitches from right to left: so that only the leftmost image's corners need to be extracted in each iteration, img1 should have been shot to the left of img2 in the capture sequence.
Accordingly, the function homography_trans needs to be modified as well:
```python
# Visualize the mapping of an image pair
def homography_trans(M, M_before, img1, img2):
    # Chain the current homography onto the accumulated one
    M = M_before @ M
    # out_img: the first image warped into the second image's frame
    x_min, x_max, y_min, y_max, M2 = calc_border(M, img1.shape)
    # Perspective transform + translation (keeps the warped image fully in view)
    M = M2 @ M
    w, h = int(round(x_max) - round(x_min)), int(round(y_max) - round(y_min))
    out_img = cv2.warpPerspective(img1, M, (w, h))
    # Align the two images to a common origin:
    # x direction
    out_img_blank_x = np.zeros((out_img.shape[0], abs(round(x_min)), 3), dtype=np.uint8)
    img2_blank_x = np.zeros((img2.shape[0], abs(round(x_min)), 3), dtype=np.uint8)
    if x_min > 0:
        out_img = cv2.hconcat((out_img_blank_x, out_img))
    if x_min < 0:
        img2 = cv2.hconcat((img2_blank_x, img2))
    # y direction
    out_img_blank_y = np.zeros((abs(round(y_min)), out_img.shape[1], 3), dtype=np.uint8)
    img2_blank_y = np.zeros((abs(round(y_min)), img2.shape[1], 3), dtype=np.uint8)
    if y_min > 0:
        out_img = cv2.vconcat((out_img, out_img_blank_y))
    if y_min < 0:
        img2 = cv2.vconcat((img2_blank_y, img2))
    # Pad the two images to the same size (zero borders):
    if img2.shape[0] < out_img.shape[0]:
        blank_y = np.zeros((out_img.shape[0] - img2.shape[0], img2.shape[1], 3), dtype=np.uint8)
        img2 = cv2.vconcat((img2, blank_y))
    else:
        blank_y = np.zeros((img2.shape[0] - out_img.shape[0], out_img.shape[1], 3), dtype=np.uint8)
        out_img = cv2.vconcat((out_img, blank_y))
    if img2.shape[1] < out_img.shape[1]:
        blank_x = np.zeros((img2.shape[0], out_img.shape[1] - img2.shape[1], 3), dtype=np.uint8)
        img2 = cv2.hconcat((img2, blank_x))
    else:
        blank_x = np.zeros((out_img.shape[0], img2.shape[1] - out_img.shape[1], 3), dtype=np.uint8)
        out_img = cv2.hconcat((out_img, blank_x))
    # Blend the overlapping images
    result = addMatches(out_img, img2)
    # White-background mask (black where the panorama has no content)
    mask = 255 * np.ones(result.shape, dtype=np.uint8)
    gray_res = cv2.cvtColor(result, cv2.COLOR_BGR2GRAY)
    mask[gray_res == 0] = 0
    cv2.imwrite('mask.jpg', mask)
    # result[result==0] = 255
    cv2.imwrite('result.jpg', result)
    return result, M
```
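The helper calc_border is not listed in the post. A plausible pure-NumPy reconstruction (an assumption; the author's version may differ) maps the four image corners through M, takes their bounding box, and builds the translation M2 that shifts the warped image into non-negative coordinates:

```python
import numpy as np

def calc_border(M, shape):
    """Bounding box of an image of the given shape after homography M,
    plus the translation matrix that moves the warp to start at (0, 0).
    (Assumed reconstruction of the post's helper.)"""
    h, w = shape[:2]
    # Four corners as homogeneous column vectors
    corners = np.array([[0, 0, 1], [w, 0, 1], [w, h, 1], [0, h, 1]], dtype=np.float64).T
    warped = M @ corners
    warped = warped[:2] / warped[2]  # normalize homogeneous coordinates
    x_min, y_min = warped.min(axis=1)
    x_max, y_max = warped.max(axis=1)
    M2 = np.array([[1.0, 0.0, -x_min],
                   [0.0, 1.0, -y_min],
                   [0.0, 0.0, 1.0]])
    return x_min, x_max, y_min, y_max, M2
```

cv2.perspectiveTransform would do the same corner projection, but plain matrix arithmetic keeps the sketch dependency-free.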
The key change is here (it propagates the homography from one pair to the next):
```python
... ...
M = M_before @ M
... ...
M = M2 @ M
```
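This chaining can be sanity-checked in plain NumPy: applying the composed matrix to a point must agree with applying the two homographies one after the other (the toy matrices below are made-up values, used only for the check):

```python
import numpy as np

def apply_h(H, pt):
    """Apply a 3x3 homography to a 2D point via homogeneous coordinates."""
    v = H @ np.array([pt[0], pt[1], 1.0])
    return v[:2] / v[2]

# Two toy homographies (assumed values, not from the post):
# M_current maps the new image into the previous image's frame,
# M_before accumulates all earlier pairwise transforms.
M_current = np.array([[1.0, 0.02, 30.0],
                      [0.01, 1.0, -5.0],
                      [0.0, 0.0, 1.0]])
M_before = np.array([[0.98, 0.0, 120.0],
                     [0.0, 1.01, 8.0],
                     [0.0, 0.0, 1.0]])

p = (50.0, 80.0)
# Sequential: current transform first, then the accumulated one
seq = apply_h(M_before, apply_h(M_current, p))
# Composed: the single matrix M_before @ M_current
comp = apply_h(M_before @ M_current, p)
print(np.allclose(seq, comp))  # True
```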
(P.S.: stitching from left to right instead seems to misbehave…)
The other open problem: the homographies computed for images further along the sequence become more and more distorted, so the panorama looks increasingly warped toward its edges.