iOS images in a bubble effect


Problem description



I have some images that I want to "put inside a bubble". The bubbles kind of float around the screen with these images trapped inside them.

The best approach would be to combine the inside image with the bubble image and somehow warp the inside image so that it looks like it is reflected on the inside of the bubble.

Does anyone know how to achieve this effect without using textures and meshes? Perhaps someone remembers an old project or example that did something similar?

Here is an example of what I mean:

Solution

You can do this using the GPUImageSphereRefractionFilter from my open source GPUImage framework.

I describe in detail how this works in this answer to a question about a similar effect on Android. Basically, I use a fragment shader to refract the light that passes through an imaginary sphere, then I use that to do a lookup into a texture containing the source image. The background is blurred using a simple Gaussian blur.
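
For a still image, the pipeline might be wired up roughly like this. This is a minimal sketch, not code from the original answer: it assumes the Objective-C GPUImage API used from Swift, the asset name "flower.png" is a placeholder, and the center/radius/refractiveIndex values are illustrative (the calls for reading the result back vary slightly between GPUImage releases).

 import GPUImage
 import UIKit

 // Placeholder asset name; any UIImage will do.
 let inputImage = UIImage(named: "flower.png")!

 let source: GPUImagePicture = GPUImagePicture(image: inputImage)

 let sphereFilter = GPUImageSphereRefractionFilter()
 sphereFilter.center = CGPoint(x: 0.5, y: 0.5)   // normalized texture coordinates
 sphereFilter.radius = 0.5                       // sphere radius as a fraction of the image
 sphereFilter.refractiveIndex = 0.71             // illustrative refraction strength

 source.addTarget(sphereFilter)

 // Render and read the result back as a UIImage.
 sphereFilter.useNextFrameForImageCapture()
 source.processImage()
 let bubbleImage = sphereFilter.imageFromCurrentFramebuffer()

The blurred background mentioned above comes from a separate Gaussian blur pass (e.g. GPUImageGaussianBlurFilter) applied to the background image, not from the sphere filter itself.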

If you want to achieve the exact look of the image you show, you might need to tweak this fragment shader to add some grazing-angle color to the sphere, but this should get you fairly close.

For the fun of it, I decided to try to replicate the glass sphere above more closely. I added grazing-angle lighting and a specular reflection on the sphere, and did not invert the refracted texture coordinates, leading to this result:

I used the following fragment shader for this newer version:

 varying highp vec2 textureCoordinate;

 uniform sampler2D inputImageTexture;

 uniform highp vec2 center;
 uniform highp float radius;
 uniform highp float aspectRatio;
 uniform highp float refractiveIndex;
// uniform vec3 lightPosition;
 const highp vec3 lightPosition = vec3(-0.5, 0.5, 1.0);
 const highp vec3 ambientLightPosition = vec3(0.0, 0.0, 1.0);

 void main()
 {
     // Correct the texture coordinate for the image aspect ratio so the sphere stays circular
     highp vec2 textureCoordinateToUse = vec2(textureCoordinate.x, (textureCoordinate.y * aspectRatio + 0.5 - 0.5 * aspectRatio));
     highp float distanceFromCenter = distance(center, textureCoordinateToUse);
     // 1.0 inside the sphere's radius, 0.0 outside (used to mask the final color)
     lowp float checkForPresenceWithinSphere = step(distanceFromCenter, radius);

     distanceFromCenter = distanceFromCenter / radius;

     // Reconstruct the surface normal of an imaginary sphere at this fragment
     highp float normalizedDepth = radius * sqrt(1.0 - distanceFromCenter * distanceFromCenter);
     highp vec3 sphereNormal = normalize(vec3(textureCoordinateToUse - center, normalizedDepth));

     // Refract a view ray entering along -Z, then flip the offset so the image is not upside-down
     highp vec3 refractedVector = 2.0 * refract(vec3(0.0, 0.0, -1.0), sphereNormal, refractiveIndex);
     refractedVector.xy = -refractedVector.xy;

     // Sample the source image with the refracted direction, remapped from [-1, 1] to [0, 1]
     highp vec3 finalSphereColor = texture2D(inputImageTexture, (refractedVector.xy + 1.0) * 0.5).rgb;

     // Grazing angle lighting
     highp float lightingIntensity = 2.5 * (1.0 - pow(clamp(dot(ambientLightPosition, sphereNormal), 0.0, 1.0), 0.25));
     finalSphereColor += lightingIntensity;

     // Specular lighting
     lightingIntensity  = clamp(dot(normalize(lightPosition), sphereNormal), 0.0, 1.0);
     lightingIntensity  = pow(lightingIntensity, 15.0);
     finalSphereColor += vec3(0.8, 0.8, 0.8) * lightingIntensity;

     gl_FragColor = vec4(finalSphereColor, 1.0) * checkForPresenceWithinSphere;
 }

and this filter can be run using a GPUImageGlassSphereFilter.
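
As a hedged usage sketch (reusing the source picture from the example earlier, and assuming GPUImageGlassSphereFilter exposes the same center/radius/refractiveIndex properties as the refraction filter it is based on):

 // Swap the glass-sphere variant into the same still-image pipeline as above.
 let glassFilter = GPUImageGlassSphereFilter()
 glassFilter.center = CGPoint(x: 0.5, y: 0.5)
 glassFilter.radius = 0.5
 glassFilter.refractiveIndex = 0.71

 source.addTarget(glassFilter)
 glassFilter.useNextFrameForImageCapture()
 source.processImage()
 let glassBubbleImage = glassFilter.imageFromCurrentFramebuffer()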
