So I was making a settings GUI window, and it worked totally fine in Roblox Studio, but when I went to play with my dev team, we noticed something strange. The blur was server-sided, which meant that if I turned the blur all the way up to 100 with the GUI, it showed on everyone else's screens too, and if somebody else then changed it, everyone's blur would jump to his amount. Which made me ask myself: is this a bug or not? Script:
add.MouseButton1Click:connect(function()
	if script.Parent.BlockedFrame.Visible == false then
		sizeofblur = sizeofblur + 1
		game.Lighting.Blur.Size = sizeofblur
		script.Parent.SizeBlur.Text = sizeofblur
		if sizeofblur == 100 then
			add.Visible = false
		elseif sizeofblur == 2 then
			Sub.Visible = true
		end
	end
end)

Sub.MouseButton1Click:connect(function()
Is your game FilteringEnabled? When FilteringEnabled is off, changes a LocalScript makes to replicated instances (like Lighting) are sent to the server and on to every other client, which is why everyone sees your blur. There are ways around this, though.
Usually when someone uses a blur for a GUI, they create the BlurEffect inside workspace.CurrentCamera. This ensures it only blurs the local player's screen, since each client's camera is local and never replicated.
Doing this would result in the following code:
-- Create the BlurEffect once, instead of a new one on every click
local Blur = Instance.new("BlurEffect")
Blur.Size = 0
Blur.Parent = workspace.CurrentCamera

add.MouseButton1Click:Connect(function()
	if not script.Parent.BlockedFrame.Visible then
		sizeofblur = sizeofblur + 1
		Blur.Size = sizeofblur
		script.Parent.SizeBlur.Text = sizeofblur
		if sizeofblur == 100 then
			add.Visible = false
		else
			Sub.Visible = true
		end
	end
end)
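For completeness, the Sub button can mirror the add handler against the same BlurEffect. This is only a sketch under the same assumptions as above (a shared Blur instance, a sizeofblur counter, and Sub meaning "decrease the blur"); adjust the thresholds to match your own GUI logic:

Sub.MouseButton1Click:Connect(function()
	if not script.Parent.BlockedFrame.Visible then
		sizeofblur = sizeofblur - 1
		Blur.Size = sizeofblur
		script.Parent.SizeBlur.Text = sizeofblur
		if sizeofblur == 0 then
			Sub.Visible = false -- nothing left to subtract
		else
			add.Visible = true -- re-enable adding once below the cap
		end
	end
end)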
15 |
Hopefully this helps.