I have two images of different sizes that I want to composite together using a CIFilter. ImageA size is 2400x1800. ImageB size is 1200x900.
Because the two images are different sizes, when they are composited ImageB ends up positioned at the bottom left, at a quarter of the size of ImageA.
That behaviour makes sense, but it is not what I intended: I would like ImageB to be resized to the same full size as ImageA.
I use an extension to resize ImageB, but it is very slow and doesn't appear to have any effect on the composited output.
Questions
How do I efficiently resize ImageB to fit the same full size as ImageA, keeping both images center aligned, and maintaining the aspect ratio?
Does CIFilter have an inbuilt option to resize images before applying a filter?
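To illustrate the second question: I have come across CILanczosScaleTransform, which looks like Core Image's own scaling filter, but I'm not sure whether it is the intended way to do this, or how it would keep the images center aligned. Roughly what I mean (the 2.0 scale is just my 2400/1200 calculation, untested in my pipeline):

// Possible in-pipeline resize using CILanczosScaleTransform (sketch only)
let scaleFilter = CIFilter(name: "CILanczosScaleTransform")!
scaleFilter.setValue(CIImage(image: ImageB), forKey: kCIInputImageKey)
scaleFilter.setValue(2.0, forKey: kCIInputScaleKey)       // 2400 / 1200 = 2.0
scaleFilter.setValue(1.0, forKey: kCIInputAspectRatioKey) // keep the aspect ratio
let scaledImageB = scaleFilter.outputImage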
Note: As a side thought, I considered an unconventional approach using a UIImageView sized to ImageA, loading ImageB into it, and then taking a snapshot. That seems like it would guarantee the correct size and aspect ratio with good performance.
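For what it's worth, the snapshot idea would look something like this (a rough sketch, not something I have profiled):

// Sketch of the UIImageView snapshot idea: aspect-fit ImageB into a view
// the size of ImageA, then capture the view's contents as a UIImage.
let container = UIImageView(frame: CGRect(origin: .zero, size: ImageA.size))
container.contentMode = .scaleAspectFit
container.image = ImageB
let snapshotOfImageB = UIGraphicsImageRenderer(size: container.bounds.size).image { ctx in
    container.layer.render(in: ctx.cgContext)
}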
Code
// Resize image extension
import UIKit

extension UIImage {
    /// Redraws the image into a new bitmap context of the given size.
    /// Note: this stretches to exactly `size`; it does not preserve the aspect ratio on its own.
    func resized(to size: CGSize) -> UIImage {
        return UIGraphicsImageRenderer(size: size).image { _ in
            draw(in: CGRect(origin: .zero, size: size))
        }
    }
}
// Resize image usage (UIImage() here is just a placeholder for the real images)
let ImageA = UIImage()
var ImageB = UIImage()
// Stretch ImageB up to ImageA's size (ImageB must be a var to be reassigned)
let resizedImage = ImageB.resized(to: ImageA.size)
ImageB = resizedImage
// Composite filter
let addCIFilter = CIFilter(name: "CIColorDodgeBlendMode")!
// ImageA goes in as the foreground input, ImageB as the background
addCIFilter.setValue(CIImage(image: ImageA), forKey: kCIInputImageKey)
addCIFilter.setValue(CIImage(image: ImageB), forKey: kCIInputBackgroundImageKey)
let outputImage = addCIFilter.outputImage
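For reference, the direction I was hoping existed is something like the following, where ImageB is scaled to fit ImageA and centered entirely inside the Core Image pipeline before the blend. This is only my guess at how it might look (the scale/translate maths is mine and I haven't verified it in the project):

// Sketch: scale ImageB's CIImage to fit ImageA's extent, center it,
// then use it as the background of the blend.
if let ciA = CIImage(image: ImageA), let ciB = CIImage(image: ImageB) {
    // Uniform scale so ImageB fits inside ImageA while keeping its aspect ratio
    let scale = min(ciA.extent.width / ciB.extent.width,
                    ciA.extent.height / ciB.extent.height)
    let scaledB = ciB.transformed(by: CGAffineTransform(scaleX: scale, y: scale))
    // Translate so the scaled ImageB is centered over ImageA
    let dx = (ciA.extent.width - scaledB.extent.width) / 2
    let dy = (ciA.extent.height - scaledB.extent.height) / 2
    let centeredB = scaledB.transformed(by: CGAffineTransform(translationX: dx, y: dy))

    let blendFilter = CIFilter(name: "CIColorDodgeBlendMode")!
    blendFilter.setValue(ciA, forKey: kCIInputImageKey)
    blendFilter.setValue(centeredB, forKey: kCIInputBackgroundImageKey)
    let composited = blendFilter.outputImage
}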