How to enlarge a UIImage so that pixels stay sharp instead of becoming blurry in iOS?

I need to make a UIImageView support pinch zoom in/out, and most importantly, when the image is enlarged it should show large, crisp pixels, not the fuzzy interpolated effect, like the images below.

It should be like that:

[image: enlarged view with sharp, blocky pixels]

Should not be like that:

[image: enlarged view blurred by interpolation]

My code, which doesn't work:

- (void)viewDidLoad {
    [super viewDidLoad];

    self.view.backgroundColor = [UIColor whiteColor];

    imageView = [[ColorPickImageView alloc] initWithImage:self.image];
    imageView.contentMode = UIViewContentModeScaleAspectFit;

    [self addGestureRecognizerToView:imageView];
    [imageView setUserInteractionEnabled:YES];
    [imageView setMultipleTouchEnabled:YES];

    [self.view addSubview:imageView];

    [imageView setTranslatesAutoresizingMaskIntoConstraints:NO];
    NSDictionary *views = NSDictionaryOfVariableBindings(imageView);
    [self.view addConstraints:[NSLayoutConstraint constraintsWithVisualFormat:@"H:|-0-[imageView]-0-|" options:0 metrics:nil views:views]];
    [self.view addConstraints:[NSLayoutConstraint constraintsWithVisualFormat:@"V:|-0-[imageView]-0-|" options:0 metrics:nil views:views]];
}

- (void)addGestureRecognizerToView:(UIView *)view {
    UIPinchGestureRecognizer *pinchGestureRecognizer = [[UIPinchGestureRecognizer alloc] initWithTarget:self action:@selector(pinchView:)];
    [view addGestureRecognizer:pinchGestureRecognizer];
}

- (void)pinchView:(UIPinchGestureRecognizer *)pinchGestureRecognizer {
    UIView *view = pinchGestureRecognizer.view;
    if (pinchGestureRecognizer.state == UIGestureRecognizerStateBegan || pinchGestureRecognizer.state == UIGestureRecognizerStateChanged) {
        // Apply the pinch scale incrementally and reset it so each callback
        // contributes only its delta.
        view.transform = CGAffineTransformScale(view.transform, pinchGestureRecognizer.scale, pinchGestureRecognizer.scale);
        pinchGestureRecognizer.scale = 1;
    }
}

OK, I got intrigued and could not help but set up a quick 15-minute test. The results are in line with the SO answer I already quoted in the comments. The following works fine for me to generate a pixelated image:

- (void)viewDidLoad {
    [super viewDidLoad];

    UIImage *image = [UIImage imageNamed:@"LocationDot"];
    self.imageView.image = image;
    self.imageView.contentMode = UIViewContentModeScaleAspectFit;
    // Nearest-neighbor magnification keeps enlarged pixels sharp.
    self.imageView.layer.magnificationFilter = kCAFilterNearest;
//  self.imageView.layer.shouldRasterize = YES;
//  self.imageView.transform = CGAffineTransformMakeScale(4, 4);
}

This generates:

[image: pixelated result with kCAFilterNearest]

With the default kCAFilterLinear you get this:

[image: blurred result with kCAFilterLinear]

When applying the transform, it also has the desired effect:

[image: transformed view, still pixelated]

Setting shouldRasterize has no effect, as expected. With a bitmap as content, it only has an effect if you put a transform on the layer and then composite it with another layer. And even then, the documentation says:

If the rasterized bitmap requires scaling during compositing, the filters in the minificationFilter and magnificationFilter properties are applied as needed.
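Putting the two pieces together, here is a minimal sketch of how the question's own setup could adopt the fix: it assumes the same `ColorPickImageView` class, `imageView` ivar, and `addGestureRecognizerToView:` helper from the question, and only adds the one `magnificationFilter` line; the pinch handler stays unchanged.

```
- (void)viewDidLoad {
    [super viewDidLoad];

    imageView = [[ColorPickImageView alloc] initWithImage:self.image];
    imageView.contentMode = UIViewContentModeScaleAspectFit;

    // The key line: sample with nearest-neighbor when magnifying, so
    // enlarged pixels render as crisp squares instead of being blended.
    imageView.layer.magnificationFilter = kCAFilterNearest;

    [self addGestureRecognizerToView:imageView];
    [imageView setUserInteractionEnabled:YES];
    [self.view addSubview:imageView];
}
```

Because the pinch gesture scales the view's transform rather than redrawing the bitmap, the layer keeps magnifying the same content and the nearest filter applies at every zoom level.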