
AppGameKit Classic Chat / GetSpriteHitCategory, SetSpriteCollideBit - does this even work?

nz0 (AGK Developer)
Joined: 13th Jun 2007 | Location: Cheshire, UK
Posted: 8th Aug 2017 22:40
I've tried this and can't get it to work with anything other than the default settings; any bit other than bit 1 has no effect.

Setting sprites with all bits:

SetSpriteCategoryBit(o,0x1111,1)
SetSpriteCollideBit(o,0x1111,1)

only works when testing against category 1, which is the default anyway.
I've tried multiple permutations, all with the same result.
Markus (Valued Member)
Joined: 10th Apr 2004 | Location: Germany
Posted: 9th Aug 2017 01:26
print(str(0x1111)) // hexadecimal: prints 4369
print(str(%1111))  // binary: prints 15
AGK (Steam) V2017.07.19 : Windows 10 Pro 64 Bit : AMD (17.4.4) Radeon R7 265 : Mac mini OS Sierra (10.12.2)
Scraggle (Moderator)
Joined: 10th Jul 2003 | Location: Yorkshire
Posted: 9th Aug 2017 07:56
As Markus points out, prefixing your number with 0x means it is treated as hexadecimal, but you are trying to write binary, so you need to ditch the 0x and use % instead.
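
For illustration, a minimal sketch of the corrected calls, assuming the intent was to enable categories 1 to 4 for sprite o (the sprite ID from the original post). Note that AGK's singular-form commands take a category index from 1 to 16 as their second argument, while the plural forms SetSpriteCategoryBits/SetSpriteCollideBits take a whole bitmask, so a binary-literal mask belongs with the latter:

// mask form: %1111 sets categories 1-4 in one call
SetSpriteCategoryBits(o, %1111)
SetSpriteCollideBits(o, %1111)

// single-bit form: second argument is a category index (1-16), not a mask
SetSpriteCategoryBit(o, 1, 1) // place sprite o in category 1
SetSpriteCollideBit(o, 1, 1)  // let sprite o collide with category 1

With the hex literal, 0x1111 evaluates to 4369, well outside the 1-16 category range, which would explain why only the default behaviour was ever seen.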
nz0 (AGK Developer)
Joined: 13th Jun 2007 | Location: Cheshire, UK
Posted: 9th Aug 2017 17:58
Duh. I knew it would be something like that. Must have been on the 2nd glass of wine by then...

cheers

