

[Computer Vision] Local Feature Detection and Matching

๐Ÿ‘€ These examples were written in VS Code on Windows 10 with Python 3.11.0.

 

Harris Corner Detection

ํ•ด๋ฆฌ์Šค ์ฝ”๋„ˆ ๊ฒ€์ถœ์€ ์ด๋ฏธ์ง€์—์„œ ์ฝ”๋„ˆ๋ฅผ ๊ฒ€์ถœํ•˜๋Š” ๊ธฐ๋ฒ•์œผ๋กœ, ์ฃผ๋กœ ํŠน์ง•์  ๊ฒ€์ถœ์— ์‚ฌ์šฉ๋œ๋‹ค.

import cv2
import numpy as np

image = cv2.imread("window.jpg")

gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
gray = np.float32(gray)

# Harris corner response map
dst = cv2.cornerHarris(gray, blockSize=2, ksize=3, k=0.04)

# Mark pixels whose response exceeds 1% of the maximum response
image[dst > 0.01 * dst.max()] = [0, 0, 255]  # draw corners in red (BGR)

cv2.imshow("Harris",image)
cv2.waitKey(0)
cv2.destroyAllWindows()

 

 

FAST Corner Detection

FAST (Features from Accelerated Segment Test) is a fast and efficient feature point (corner) detection algorithm.

 

 

Reference: Features from accelerated segment test - Wikipedia (en.wikipedia.org)

import cv2
import numpy as np

image = cv2.imread("window.jpg")

gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

# Create the FAST corner detector
fast = cv2.FastFeatureDetector_create()

# Detect corners (keypoints)
keypoints = fast.detect(gray, None)

# Draw the detected keypoints on the image
image_with_keypoints = cv2.drawKeypoints(image, keypoints, None, (0, 255, 0), cv2.DRAW_MATCHES_FLAGS_DRAW_RICH_KEYPOINTS)

cv2.imshow("FAST",image_with_keypoints)
cv2.waitKey(0)
cv2.destroyAllWindows()

 

 

Scale-Invariant Feature Detection - SIFT

์ด์ „ ์•Œ๊ณ ๋ฆฌ์ฆ˜์„ ํ†ตํ•œ ์ฝ”๋„ˆ๋Š” ์˜์ƒ์ด ํšŒ์ „ ๋˜์–ด๋„ ์—ฌ์ „ํžˆ ์ฝ”๋„ˆ๋กœ ๊ฒ€์ถœ๋œ๋‹ค. ์ฝ”๋„ˆ๋Š” ํšŒ์ „ ๋ถˆ๋ณ€ ํŠน์ง•์ ์ด๋ผ๊ณ  ํ•  ์ˆ˜ ์žˆ๋‹ค.

 

๊ทธ๋Ÿฌ๋‚˜ ์˜์ƒ์˜ ํฌ๊ธฐ๊ฐ€ ๋ณ€๊ฒฝ๋  ๊ฒฝ์šฐ ๋”์ด์ƒ ์ฝ”๋„ˆ๋กœ ๊ฒ€์ถœ๋˜์ง€ ์•Š์„ ์ˆ˜ ์žˆ๋‹ค.

 

ํฌ๊ธฐ๊ฐ€ ๋ณ€ํ•ด๋„ ์ง€์†์ ์œผ๋กœ ์ฝ”๋„ˆ๋ฅผ ๊ฒ€์ถœํ•˜๋Š” ์•Œ๊ณ ๋ฆฌ์ฆ˜์ด ์—ฐ๊ตฌ ๋˜์—ˆ๋‹ค.

๊ทธ ์ค‘ ๊ฐ€์žฅ ๋Œ€ํ‘œ์  ์•Œ๊ณ ๋ฆฌ์ฆ˜์ด SIFT๋กœ Scale Invariant Feature Transform์˜ ์•ฝ์ž์ด๋‹ค.

 

SIFT๋Š” ์ด๋ฏธ์ง€์˜ ๋‹ค์–‘ํ•œ ์Šค์ผ€์ผ์—์„œ ํŠน์ง•์ ์„ ๊ฒ€์ถœํ•˜๊ธฐ ์œ„ํ•ด Gaussian Blur๋ฅผ ์‚ฌ์šฉํ•˜์—ฌ ์—ฌ๋กœ ์Šค์ผ€์ผ์˜ ์ด๋ฏธ์ง€๋ฅผ ์ƒ์„ฑํ•œ๋‹ค. ์ด๋ฅผ ํ†ตํ•ด ์ด๋ฏธ์ง€์˜ ๋‹ค์–‘ํ•œ ํฌ๊ธฐ์—์„œ ํŠน์ง•์ ์„ ํƒ์ง€ํ•  ์ˆ˜ ์žˆ๋‹ค.

 

๊ฐ ์Šค์ผ€์ผ์—์„œ DoG(Difference of Gaussian) ๋ฐฉ๋ฒ•์„ ์‚ฌ์šฉํ•˜์—ฌ ๊ทน๋Œ€๊ฐ’(์ตœ๋Œ€๊ฐ’)๊ณผ ๊ทน์†Œ๊ฐ’(์ตœ์†Œ๊ฐ’)์„ ์ฐพ์•„ ํŠน์ง•์ ์„ ๊ฒ€์ถœํ•œ๋‹ค. ์ด ๋‹จ๊ณ„์—์„œ ๊ฒ€์ถœ๋œ ํฌ์ธํŠธ๋Š” ์ž ์žฌ์ ์ธ ํŠน์ง•์ ์œผ๋กœ ๊ฐ„์ฃผ๋œ๋‹ค.

 

๊ฒ€์ถœ๋œ ํŠน์ง•์ ์˜ ์œ„์น˜๋ฅผ ์ •๋ฐ€ํ•˜๊ฒŒ ์กฐ์ •ํ•˜๊ณ  ๊ฐ ํŠน์ง•์ ์— ๋Œ€ํ•ด ์ฃผ ๋ฐฉํ–ฅ์„ ์„ค์ •ํ•˜์—ฌ ํšŒ์ „ ๋ถˆ๋ณ€์„ฑ์„ ๋ถ€์—ฌํ•œ๋‹ค.

์ด ๋ฐฉํ–ฅ์€ ์ฃผ๋ณ€ ํ”ฝ์…€์˜ ๊ธฐ์šธ๊ธฐ๋ฅผ ๊ธฐ๋ฐ˜์œผ๋กœ ๊ณ„์‚ฐ๋œ๋‹ค.

 

๊ฐ ํŠน์ง•์ ์˜ ์ฃผ๋ณ€ ํ”ฝ์…€ ์ •๋ณด๋ฅผ ๊ธฐ๋ฐ˜์œผ๋กœ ์„ค๋ช…์ž๋ฅผ ์ƒ์„ฑํ•œ๋‹ค.

์ผ๋ฐ˜์ ์œผ๋กœ 16x16 ํ”ฝ์…€ ์˜์—ญ์„ 4x4 ๊ทธ๋ฆฌ๋“œ๋กœ ๋‚˜๋ˆ„๊ณ , ๊ฐ ๊ทธ๋ฆฌ๋“œ์˜ ๋ฐฉํ–ฅ ํžˆ์Šคํ† ๊ทธ๋žจ์„ ๊ณ„์‚ฐํ•˜์—ฌ ํŠน์ง•์„ ํ‘œํ˜„ํ•œ๋‹ค.

์ด ์„ค๋ช…์ž๋Š” ๊ณ ์œ ํ•œ ํŠน์ง•์ ์˜ "์ง€๋ฌธ"์—ญํ• ์„ ํ•˜๊ณ  ๋น„๊ต ๋งค์นญ์— ์‚ฌ์šฉ๋œ๋‹ค.

 

๋‹ค๋ฅธ ์ด๋ฏธ์ง€์—์„œ ๊ฒ€์ถœ๋œ SIFT ํŠน์ง•์ ๊ณผ ์„ค๋ช…์ž๋ฅผ ๋น„๊ตํ•˜์—ฌ ๋งค์นญํ•œ๋‹ค. ์ฃผ๋กœ ์œ ํด๋ฆฌ๋“œ ๊ฑฐ๋ฆฌ ๋˜๋Š” ์œ ์‚ฌ๋„ ๊ธฐ๋ฐ˜์œผ๋กœ ๋งค์นญ์„ ์ˆ˜ํ–‰ํ•œ๋‹ค.

 

Scale-Invariant Feature Detection - SURF

SIFT is computationally expensive. SURF (Speeded-Up Robust Features) was proposed as an improvement on this.

 

Instead of repeatedly applying Gaussian filters, SURF builds its scale space with box-filter approximations that can be evaluated very quickly using an integral image.

This is the main source of its speed-up.

 

์ดํ›„ ์ด๋ฏธ์ง€์˜ ๊ฐ ์Šค์ผ€์ผ์—์„œ Hessian ํ–‰๋ ฌ์„ ์‚ฌ์šฉํ•˜์—ฌ ํŠน์ง•์ ์„ ๊ฒ€์ถœํ•œ๋‹ค.

 

ํŠน์ง•์ ์— ๋Œ€ํ•ด ์ฃผ๋ฐฉํ–ฅ์„ ๊ณ„์‚ฐํ•˜์—ฌ ํšŒ์ „ ๋ถˆ๋ณ€์„ฑ์„ ์ œ๊ณตํ•œ๋‹ค.

SIFT์™€ ๋™์ผํ•˜๊ฒŒ ๋ฐฉํ–ฅ์€ ์ฃผ๋ณ€ ํ”ฝ์…€์˜ ๊ธฐ์šธ๊ธฐ๋ฅผ ๊ธฐ๋ฐ˜์œผ๋กœ ๊ณ„์‚ฐ๋œ๋‹ค.

 

SURF์˜ ์„ค๋ช…์ž ์ƒ์„ฑ์€ 64์ฐจ์› ๋˜๋Š” 128์ฐจ์›์˜ ์„ค๋ช…์ž๋ฅผ ์ƒ์„ฑํ•œ๋‹ค.

ํŠน์ง•์  ์ฃผ๋ณ€์˜ Haar ์›จ์ด๋ธŒ๋ › ์‘๋‹ต์„ ๊ธฐ๋ฐ˜์œผ๋กœ ๋ฐฉํ–ฅ ํžˆ์Šคํ† ๊ทธ๋žจ์„ ๊ณ„์‚ฐํ•˜์—ฌ ํŠน์ง•์ ์„ ํ‘œํ˜„ํ•œ๋‹ค.

 

SIFT์—์„œ์™€ ๊ฐ™์ด ๋‹ค๋ฅธ ์ด๋ฏธ์ง€์—์„œ ๊ฒ€์ถœ๋œ ํŠน์ง•์ ๊ณผ ๋น„๊ตํ•˜์—ฌ ๋งค์นญํ•œ๋‹ค. ์œ ํด๋ฆฌ๋“œ ๊ฑฐ๋ฆฌ ๋˜๋Š” ์œ ์‚ฌ๋„ ๊ธฐ๋ฐ˜ ๋ฉ”ํŠธ๋ฆญ์„ ์‚ฌ์šฉํ•œ๋‹ค.

 

Scale-Invariant Feature Detection - ORB

ORB (Oriented FAST and Rotated BRIEF) is a feature detection algorithm proposed as a lightweight alternative to the heavy computation of SIFT and SURF.

 

ORB uses the FAST algorithm to detect keypoints quickly, which makes it fast enough for real-time applications.

 

์ฃผ๋ฐฉํ–ฅ ๊ณ„์‚ฐ์„ ํ†ตํ•œ ํšŒ์ „ ๋ถˆ๋ณ€์„ฑ ์ œ๊ณต์€ SIFT, SURF์™€ ๋™์ผํ•˜๋‹ค.

 

ORB describes each keypoint with a BRIEF (Binary Robust Independent Elementary Features) descriptor.

BRIEF is a binary descriptor designed for fast matching.

ORB steers BRIEF by the keypoint orientation (rBRIEF), making the descriptor robust to rotation.

 

ORB์˜ ํŠน์ง•์ ์€ ๋‹ค๋ฅธ ์ด๋ฏธ์ง€์˜ ํŠน์ง•์ ๊ณผ ์ด์ง„ ์„ค๋ช…์ž๋ฅผ ๋น„๊ตํ•˜์—ฌ ๋งค์นญ๋˜๊ณ  ์ผ๋ฐ˜์ ์œผ๋กœ ํ•ด๋ฐ ๊ฑฐ๋ฆฌ(Hamming Distance)๋ฅผ ์‚ฌ์šฉํ•˜์—ฌ ๋งค์นญ์˜ ์œ ์‚ฌ์„ฑ์„ ๊ณ„์‚ฐํ•œ๋‹ค.

import cv2
import numpy as np

# ์ด๋ฏธ์ง€ ์ฝ๊ธฐ
image = cv2.imread("window.jpg")

# ORB ๊ฐ์ฒด ์ƒ์„ฑ
orb = cv2.ORB_create()

# ํ‚ค ํฌ์ธํŠธ ๋ฐ ์„ค๋ช…์ž ๊ฒ€์ถœ
keypoints, descriptors = orb.detectAndCompute(image, None)

# ํ‚ค ํฌ์ธํŠธ๋ฅผ ์ด๋ฏธ์ง€์— ๊ทธ๋ฆฌ๊ธฐ
image_with_keypoints = image.copy()

# ๋žœ๋ค ์ƒ‰์ƒ์œผ๋กœ ํ‚ค ํฌ์ธํŠธ ๊ทธ๋ฆฌ๊ธฐ
for kp in keypoints:
    # ๋žœ๋ค ์ƒ‰์ƒ ์ƒ์„ฑ (BGR ํ˜•์‹)
    random_color = np.random.randint(0, 256, size=3).tolist()
    # ํ‚ค ํฌ์ธํŠธ ์œ„์น˜์™€ ํฌ๊ธฐ์— ๋”ฐ๋ผ ์› ๊ทธ๋ฆฌ๊ธฐ
    cv2.circle(image_with_keypoints, (int(kp.pt[0]), int(kp.pt[1])), int(kp.size / 2), random_color, 1)

# ๊ฒฐ๊ณผ ์ด๋ฏธ์ง€ ํ‘œ์‹œ
cv2.imshow('Keypoints with Random Colors', image_with_keypoints)
cv2.waitKey(0)
cv2.destroyAllWindows()
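
To complement the keypoint-drawing example above, here is a minimal sketch of the Hamming-distance matching described earlier; the file names are placeholders, and cross-checking plus a top-50 cutoff are just one reasonable way to filter the matches.

import cv2

# Placeholder file names: two images containing the same object
img1 = cv2.imread("cat_head.png", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("cat.png", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create()
kp1, desc1 = orb.detectAndCompute(img1, None)
kp2, desc2 = orb.detectAndCompute(img2, None)

# Binary descriptors are compared with the Hamming distance
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(desc1, desc2), key=lambda m: m.distance)

# Keep the 50 closest matches for display (an arbitrary cutoff)
result = cv2.drawMatches(img1, kp1, img2, kp2, matches[:50], None,
                         flags=cv2.DrawMatchesFlags_NOT_DRAW_SINGLE_POINTS)

cv2.imshow("ORB matches", result)
cv2.waitKey(0)
cv2.destroyAllWindows()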

 

 

 

Homography

ํ‚คํฌ์ธํŠธ ๋งค์นญ์„ ํ•˜๊ณ  ํ˜ธ๋ชจ๊ทธ๋ž˜ํ”ผ๋ฅผ ๊ณ„์‚ฐํ•˜์—ฌ ๊ฐ์ฒด ๊ฒ€์ถœ์„ ํ•  ์ˆ˜ ์žˆ๋‹ค.

A homography describes the relationship between two images obtained by viewing the same plane in 3D space from different viewpoints.

 

import cv2
import sys
import numpy as np

# ์ด๋ฏธ์ง€ ์ฝ๊ธฐ
src1 = cv2.imread('cat_head.png', cv2.IMREAD_GRAYSCALE)
src1 = cv2.resize(src1,(int(src1.shape[1]//2),int(src1.shape[0]//2)),interpolation=cv2.INTER_LANCZOS4)
src2 = cv2.imread('cat.png', cv2.IMREAD_GRAYSCALE)
src2 = cv2.resize(src2,(int(src2.shape[1]//2),int(src2.shape[0]//2)),interpolation=cv2.INTER_LANCZOS4)

if src1 is None or src2 is None:
    print('Image load failed!')
    sys.exit()

# ํŠน์ง•์  ์•Œ๊ณ ๋ฆฌ์ฆ˜ ๊ฐ์ฒด ์ƒ์„ฑ (KAZE, AKAZE, ORB ๋“ฑ)
feature = cv2.KAZE_create()  # ๊ธฐ๋ณธ๊ฐ’์ธ L2๋†ˆ ์ด์šฉ
# feature = cv2.AKAZE_create()
# feature = cv2.ORB_create()

# ํŠน์ง•์  ๊ฒ€์ถœ ๋ฐ ๊ธฐ์ˆ ์ž ๊ณ„์‚ฐ
kp1, desc1 = feature.detectAndCompute(src1, None)
kp2, desc2 = feature.detectAndCompute(src2, None)

# ํŠน์ง•์  ๋งค์นญ
matcher = cv2.BFMatcher_create()
matches = matcher.match(desc1, desc2)

# ์ข‹์€ ๋งค์นญ ๊ฒฐ๊ณผ ์„ ๋ณ„
matches = sorted(matches, key=lambda x: x.distance)
good_matches = matches[:80]

print('# of kp1:', len(kp1))
print('# of kp2:', len(kp2))
print('# of matches:', len(matches))
print('# of good_matches:', len(good_matches))

# Compute the homography from the matched point pairs
pts1 = np.array([kp1[m.queryIdx].pt for m in good_matches]).reshape(-1, 1, 2).astype(np.float32)
pts2 = np.array([kp2[m.trainIdx].pt for m in good_matches]).reshape(-1, 1, 2).astype(np.float32)

H, _ = cv2.findHomography(pts1, pts2, cv2.RANSAC)  # pts1 and pts2 must have shape (N, 1, 2), float32

# Draw the matches side by side
dst = cv2.drawMatches(src1, kp1, src2, kp2, good_matches, None, flags=cv2.DrawMatchesFlags_NOT_DRAW_SINGLE_POINTS)

(h, w) = src1.shape[:2]

# Corner coordinates of the query image (src1)
corners1 = np.array([[0, 0], [0, h-1], [w-1, h-1], [w-1, 0]]).reshape(-1, 1, 2).astype(np.float32)

# Project the corners into src2 using the homography H
corners2 = cv2.perspectiveTransform(corners1, H)

# corners2 is in src2 coordinates; drawMatches places src2 to the right of src1, so shift by src1's width
corners2 = corners2 + np.float32([w, 0])

# Draw the projected quadrilateral around the detected object
cv2.polylines(dst, [np.int32(corners2)], True, (0, 255, 0), 2, cv2.LINE_AA)

cv2.imshow('Homography', dst)
cv2.waitKey(0)
cv2.destroyAllWindows()

 

 

 

์˜์ƒ ์ด์–ด ๋ถ™์ด๊ธฐ(Image Stitching)

์˜์ƒ ์ด์–ด ๋ถ™์ด๊ธฐ๋Š” ์—ฌ๋Ÿฌ ์žฅ์˜ ์˜์ƒ์„ ์„œ๋กœ ์ด์–ด ๋ถ™์—ฌ์„œ ํ•˜๋‚˜์˜ ํฐ ์˜์ƒ์„ ๋งŒ๋“œ๋Š” ๊ธฐ๋ฒ•์ด๋‹ค.

์ด๋ ‡๊ฒŒ ๋งŒ๋“ค์–ด์ง„ ์˜์ƒ์„ ํŒŒ๋…ธ๋ผ๋งˆ ์˜์ƒ(Panorama Image)๋ผ๊ณ  ํ•œ๋‹ค.

import cv2
import sys

# Read and shrink the input images
img1 = cv2.imread('cat1.png')
img1 = cv2.resize(img1, (img1.shape[1] // 2, img1.shape[0] // 2), interpolation=cv2.INTER_LANCZOS4)
img2 = cv2.imread('cat2.png')
img2 = cv2.resize(img2, (img2.shape[1] // 2, img2.shape[0] // 2), interpolation=cv2.INTER_LANCZOS4)
img3 = cv2.imread('cat3.png')
img3 = cv2.resize(img3, (img3.shape[1] // 2, img3.shape[0] // 2), interpolation=cv2.INTER_LANCZOS4)

img_list = [img1, img2, img3]

# Create the stitcher and stitch the images
stitcher = cv2.Stitcher.create()
status, stitched = stitcher.stitch(img_list)

# Check that stitching succeeded before displaying the result
if status != cv2.Stitcher_OK:
    print('Stitching failed! status =', status)
    sys.exit()

cv2.imshow("img1", img1)
cv2.imshow("img2", img2)
cv2.imshow("img3", img3)
cv2.imshow("Stitched Image", stitched)
cv2.waitKey(0)
cv2.destroyAllWindows()
