XenTegra - IGEL Weekly

IGEL Weekly: IGEL Response to CISA Black Basta Cybersecurity Advisory

XenTegra Season 1 Episode 96

In response to the increasing cyber threats identified by the Cybersecurity and Infrastructure Security Agency (CISA), the Federal Bureau of Investigation (FBI), and the Department of Health and Human Services (HHS), including the aggressive ransomware campaigns by the Black Basta group, IGEL’s Preventative Security Model™ stands as a critical defense mechanism for the healthcare industry. This model prioritizes proactive prevention over merely reactive measures, ensuring that healthcare organizations are not just responsive but proactively fortified against sophisticated malware and ransomware attacks that exploit endpoint vulnerabilities. As threats like Black Basta continue to evolve, employing advanced tactics such as spear phishing and exploiting critical vulnerabilities in commonly used software, the emphasis on robust endpoint security and comprehensive threat-prevention strategies has never been more crucial.

Host: Andy Whiteside
Co-host: Chris Feeney

WEBVTT

1
00:00:02.560 --> 00:00:17.600
Andy Whiteside: So everyone, welcome to IGEL Weekly. I'm your host, Andy Whiteside. I've got Chris Feeney with me. Chris and I were having a really passionate conversation just now about the industry, so now we're gonna reset our gears and talk about this blog here from Jason Mafera. Did I say that right?

2
00:00:18.363 --> 00:00:19.169
Chris Feeney: Mafera. Sorry.

3
00:00:19.170 --> 00:00:45.169
Andy Whiteside: Mafera. I think I get that wrong almost every time, so sorry, Jason. First of all, it's from May 15th, 2024, and the name of the blog is "IGEL Response to CISA Black Basta Cybersecurity Advisory." And specifically, we're talking about healthcare as our example here, but it's not limited to healthcare by any means. Before I do that, though, let me throw this out there: if you're an IGEL customer, or looking to be one, which you should be,

4
00:00:45.430 --> 00:00:58.329
Andy Whiteside: my main message to you is, you've got to stop running Windows on the endpoint if you don't need Windows on the endpoint. If you just do that, your security posture will go through the roof, in terms of functionality.

5
00:00:58.330 --> 00:00:59.529
Chris Feeney: How many years.

6
00:00:59.940 --> 00:01:00.270
Andy Whiteside: And your.

7
00:01:00.270 --> 00:01:00.980
Chris Feeney: Interesting.

8
00:01:00.980 --> 00:01:04.206
Andy Whiteside: Level. Sorry, Chris. I got an echo also. I'm not sure where that came from.

9
00:01:05.103 --> 00:01:14.820
Andy Whiteside: But so, Chris, why did we choose this blog today? And why does it matter that people understand how IGEL feels about the advisory announcement?

10
00:01:15.720 --> 00:01:27.500
Chris Feeney: So, I mean, these attacks on healthcare have been around for a while, but there have been some very large ones just this year alone, and most recently another one

11
00:01:28.376 --> 00:01:37.620
Chris Feeney: with Ascension healthcare, I believe. And prior to that, the Change Healthcare one, you know. And obviously

12
00:01:38.164 --> 00:01:49.015
Chris Feeney: there have been similar challenges in other parts of the world. The NHS has been hit before over in the UK, and so

13
00:01:49.770 --> 00:01:50.929
Chris Feeney: But the

14
00:01:52.170 --> 00:02:10.160
Chris Feeney: what's been interesting is, obviously, you know, the sheer size and massive aftermath of what's happened when you literally take these organizations down and patients' lives are at risk, or, you know, just the whole underlying ecosystem of, you know, payments, stuff like that. And so

15
00:02:11.590 --> 00:02:20.749
Chris Feeney: obviously, IGEL, you know, we didn't build the product to try to address this. It's actually been built from the ground up

16
00:02:21.127 --> 00:02:35.222
Chris Feeney: to live in a world that was untrusted. But things naturally, as you might imagine... I'm sure you probably looked at the way the world was when you decided to go off and build XenTegra to try to solve something, and then

17
00:02:35.520 --> 00:02:38.150
Chris Feeney: things began to come your way

18
00:02:38.535 --> 00:02:42.660
Chris Feeney: we kind of find ourselves in this position where you know.

19
00:02:42.680 --> 00:02:45.379
Chris Feeney: IGEL's Preventative Security Model is

20
00:02:45.700 --> 00:02:53.170
Chris Feeney: already built to address some of these things that are now being, you know, put out there by these government organizations. So

21
00:02:53.726 --> 00:02:55.304
Chris Feeney: and then most.

22
00:02:55.830 --> 00:02:58.139
Andy Whiteside: Is it fair to say IGEL was built

23
00:02:58.160 --> 00:03:07.719
Andy Whiteside: many, many, many years after Windows was built, and got to see all the things that they did right at the time, but that are wrong by modern-day standards, specifically around security?

24
00:03:08.290 --> 00:03:15.549
Chris Feeney: You know, it's interesting. So IGEL is about... let me see, I'm gonna go with 23 years old, roughly,

25
00:03:15.942 --> 00:03:21.990
Chris Feeney: and Linux is maybe about 33 years old, roughly. I mean, we're about 10 years behind it. But

26
00:03:22.394 --> 00:03:23.970
Chris Feeney: so what is that?

27
00:03:24.630 --> 00:03:25.810
Chris Feeney: Early

28
00:03:26.530 --> 00:03:29.840
Chris Feeney: to late '90s, perhaps, something around that, I mean. So

29
00:03:29.920 --> 00:03:33.660
Chris Feeney: Windows 2000 wasn't out yet. The Internet was becoming a thing.

30
00:03:33.951 --> 00:03:49.158
Chris Feeney: You know, hopefully there are some older folks listening, like you and me, that remember those days. But the younger folks, if you're listening: we didn't grow up in a world that had the Internet always available, let alone a worldwide network, let alone the ability to have

31
00:03:50.740 --> 00:03:53.880
Chris Feeney: networks compromised or computers compromised. But

32
00:03:54.300 --> 00:04:16.170
Chris Feeney: IGEL certainly was built to live in an untrusted world as remote computing began. And that untrusted world was basically the Internet, essentially being on a network like that, where, you know, somebody could be scanning you if you're not using firewalls or whatever. And then, you know, security-posture-wise it was kind of built to

33
00:04:16.579 --> 00:04:24.420
Chris Feeney: be available, up and running all the time, with very little risk to the endpoint. And so that was sort of the beginning.

34
00:04:24.560 --> 00:04:28.790
Chris Feeney: If you look back on some of our corporate history slides, that was kind of sort of the...

35
00:04:29.308 --> 00:04:35.531
Chris Feeney: the focus was to build this read-only approach to remote computing, if you will.

36
00:04:36.110 --> 00:05:05.990
Andy Whiteside: So let's walk through this section: preventative endpoint security for healthcare, all industries, really. Three key takeaways. The first one says organizations running IGEL as the endpoint OS significantly reduce the risk of the endpoint as an attack vector, preventing ransomware and other incidents. Basically, what he's saying there is, because it's not Windows, and it's some version of Linux, the amount of attackable area within the OS is greatly reduced, because we're not really running a lot of random applications that haven't been vetted.

37
00:05:06.382 --> 00:05:14.720
Andy Whiteside: That's step one: it's not Windows. It's Linux, and it's a specific, customized version of Linux. Step one: it's not Windows.

38
00:05:15.380 --> 00:05:18.090
Chris Feeney: And as this ran through, as I

39
00:05:18.721 --> 00:05:20.978
Chris Feeney: was just reminiscing about this

40
00:05:21.690 --> 00:05:23.410
Chris Feeney: from when I came to IGEL.

41
00:05:23.430 --> 00:05:29.090
Chris Feeney: You know, I heard about it, read about it, even played around with it every now and then before I came here. But then

42
00:05:29.140 --> 00:05:39.510
Chris Feeney: you get in front of customers where they're gonna, you know, really test it out. And I had many situations where all I did was just put IGEL on the device, didn't change any configuration, just sort of base

43
00:05:39.840 --> 00:05:46.340
Chris Feeney: out of the box, and then they would go off and do their scanning. And, you know, it always came back

44
00:05:46.350 --> 00:05:47.990
Chris Feeney: you know, very

45
00:05:48.160 --> 00:05:53.989
Chris Feeney: high, in terms of, you know, "I can't find anything, what's going on?", that type of stuff. And then, of course,

46
00:05:54.200 --> 00:06:16.826
Chris Feeney: we can always layer in, you know, what I call a base config, which turns off some things, like, hey, I'm gonna turn off access to the setup menu. But right out of the gate it was already in a much better posture, you know, as far as listening ports and stuff like that. Whereas with Windows, I mean, just as an OS, you scan that and there's all kinds of things running behind the scenes, ports open, this, that, and the other,

47
00:06:17.418 --> 00:06:28.619
Chris Feeney: and so it's not built to be, you know, hardened out of the gate. You have to harden it quite a bit before you can even put it in a usable state. IGEL was already built, sort of, to be handled that way, so.
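
The scan Chris describes, counting how many ports an endpoint is listening on, can be sketched with a minimal TCP connect scan in Python. This is purely illustrative (it is not IGEL tooling, and real assessments use dedicated scanners such as nmap); the function name and port range are our own:

```python
import socket


def scan_listening_ports(host: str, ports: range, timeout: float = 0.2) -> list[int]:
    """Return the subset of `ports` that accept a TCP connection on `host`."""
    open_ports = []
    for port in ports:
        # A successful connect() means something is listening on that port.
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
            sock.settimeout(timeout)
            if sock.connect_ex((host, port)) == 0:
                open_ports.append(port)
    return open_ports


if __name__ == "__main__":
    # Fewer listening ports means a smaller remotely reachable attack surface,
    # which is the difference the scan comparison above is getting at.
    print(scan_listening_ports("127.0.0.1", range(1, 1025)))
```

The point of the comparison: a locked-down, read-only endpoint shows few or no open ports out of the box, while a general-purpose OS typically exposes many services before hardening.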

48
00:06:28.620 --> 00:06:36.381
Andy Whiteside: And, Chris, I would say you just can't harden it. It's so capable and so functional, which is why it was built,

49
00:06:36.740 --> 00:06:56.869
Andy Whiteside: that there are always going to be ways to exploit it. There always will be. In fact, honestly, there are ways to exploit IGEL, but, man, you would really have to work hard to find them, and they're not obvious, and it wasn't built in a way that invites exploitation. Whereas Windows, in theory, was built to be exploited, because you could do anything with it, and then you have to go in and reverse-engineer it to make it secure.

50
00:06:57.320 --> 00:07:06.209
Chris Feeney: Yeah. And we've had this discussion where, you know, either you've got remote access to the device, or you've got physical access to the device.

51
00:07:06.380 --> 00:07:10.189
Chris Feeney: You know, if you don't have a locked-down config on it already,

52
00:07:11.530 --> 00:07:26.300
Chris Feeney: yeah, I mean, you can modify and change configurations. And so for us, for example, just out of the box, you know, you'll have access to the setup menu, and you can go in there and modify it, unless it's being managed and you lock that down, which you absolutely can do.

53
00:07:27.385 --> 00:07:28.360
Chris Feeney: So

54
00:07:28.651 --> 00:07:37.910
Chris Feeney: And so what I ended up doing was, I had sort of my security baseline config that I would just layer in, and it kind of maps to a lot of the stuff that we talk about in the

55
00:07:38.020 --> 00:07:42.809
Chris Feeney: Preventative Security Model. But it's just, you know, basic things, like,

56
00:07:43.380 --> 00:07:51.820
Chris Feeney: remove access to setup, put an admin password on it. I mean, just basic stuff like that that you would want to do in production anyway.

57
00:07:52.220 --> 00:08:17.099
Andy Whiteside: So you're getting into a couple of things we're gonna cover in a minute. But number two on the list of, you know, three key takeaways: integrate with multi-factor authentication solutions to reduce the threat of attack. So number one, it's not Windows; it's a less capable operating system on purpose, because it only does what you want it to do. And then number two, you integrate it with multi-factor authentication (authentication, period, like you were just saying, but then multi-factor authentication), and you've just... you know, not only is it

58
00:08:17.280 --> 00:08:25.130
Andy Whiteside: less attackable, but you've put a lock, a deadbolt, on the front door, where you have to have the key to both the knob and the deadbolt to get in.

59
00:08:25.520 --> 00:08:30.169
Chris Feeney: Absolutely. And that could be a variety of different solutions. So with OS

60
00:08:30.270 --> 00:08:35.650
Chris Feeney: 11, we had the ability to handle smart cards; with OS 12,

61
00:08:36.051 --> 00:08:40.629
Chris Feeney: similar thing. But you can also integrate, if you want to,

62
00:08:40.880 --> 00:08:47.649
Chris Feeney: before you get to the IGEL desktop and anything behind that. You know, you can set up integration with

63
00:08:47.660 --> 00:08:53.329
Chris Feeney: Entra ID, Okta, or Ping, for example, or Workspace ONE,

64
00:08:53.670 --> 00:09:03.410
Chris Feeney: or in the healthcare sector we've got Imprivata. Obviously, you can, you know, force multi-factor authentication before you allow access to whatever happens behind that afterwards,

65
00:09:03.730 --> 00:09:10.450
Chris Feeney: whether it's getting into a web browser or a virtual session. And so these are just part of IGEL's posture,

66
00:09:10.810 --> 00:09:12.809
Chris Feeney: through our partnerships and integrations.

67
00:09:13.380 --> 00:09:33.406
Andy Whiteside: And then finally, the third key takeaway: if you are running hardware that has Windows running on top of it, it's not the hardware. I mean, is the hardware exploitable? Probably, at the firmware level. But it's so incapable of doing a lot that why would you bother to exploit it when you can exploit the more powerful operating system on top of it? So what we're talking about in number three here is,

68
00:09:33.670 --> 00:09:54.530
Andy Whiteside: if you have an exploited machine, you can simply boot from, say, a USB drive, where you've hidden the hard drive, the place where all the software is living, from the scenario. And now you're really running the hardware on top of a new operating system, Linux, and you've circumvented the problem, which was the amazing, the powerful, the unbelievable Windows, which is what led to the problem to begin with.

69
00:09:55.160 --> 00:09:59.240
Chris Feeney: Yeah, absolutely. I mean, I've seen it happen first hand, literally being,

70
00:09:59.962 --> 00:10:13.069
Chris Feeney: you know, while an attack is occurring, an organization running around getting UD Pockets configured and plugging them into devices, in a preventive way, trying to prevent

71
00:10:13.180 --> 00:10:19.150
Chris Feeney: the device being hit. But then, of course, if it is hit, you can bypass the hard drive.

72
00:10:19.270 --> 00:10:21.150
Chris Feeney: And we did a demo of this

73
00:10:21.524 --> 00:10:32.910
Chris Feeney: for example, at our DISRUPT conference; we also did it at the HIMSS conference. We had a Windows laptop there where, you know, we did a simulation of a ransomware attack, and then we just shut it down, booted to the UD Pocket,

74
00:10:33.230 --> 00:10:36.860
Chris Feeney: and it came right back up in just a few minutes. So.

75
00:10:37.650 --> 00:10:45.899
Andy Whiteside: So, Chris, we've got this model here, the Preventative Security Model, this diagram. We'll just go through it, because it covers what's in the rest of the blog,

76
00:10:46.208 --> 00:11:07.790
Andy Whiteside: and we'll just hit them pretty succinctly here. First and foremost, it's not Windows, we've established that; multi-factor authentication capabilities, which you should leverage; and then the ability to use it in a pinch. But in this diagram here, first of all, centralized management is the key to all of this. But at the endpoint itself, we've got no local data.

77
00:11:08.430 --> 00:11:10.130
Chris Feeney: None other than a config.

78
00:11:11.810 --> 00:11:16.460
Chris Feeney: I mean, you're not saving Word docs. You're not saving local documents. You're not saving

79
00:11:16.880 --> 00:11:19.899
Chris Feeney: PDFs or any of that stuff. None of that is happening.

80
00:11:19.900 --> 00:11:27.950
Andy Whiteside: Yeah, you would have to enable some way to do that, and by default that's not turned on or encouraged.

81
00:11:28.560 --> 00:11:34.000
Chris Feeney: Right. And even if you somehow were, on a reboot it's not gonna get saved.

82
00:11:34.100 --> 00:11:37.590
Chris Feeney: So... but yeah, at the end of the day,

83
00:11:37.700 --> 00:11:53.049
Chris Feeney: if a device gets stolen, take a laptop running IGEL on it, it gets stolen, there's no local configuration... I mean, there's no data on there other than the config, which you can also lock down if you wanted to. But even if you didn't take that extra step,

84
00:11:53.540 --> 00:12:04.359
Chris Feeney: it's still just a configuration. A user would still have to know how to get to stuff from there. It's not gonna be like, you know, all of a sudden all of Andy's secrets about

85
00:12:04.991 --> 00:12:09.818
Chris Feeney: how to run an Airbnb are now compromised for the world to see, you know.

86
00:12:10.626 --> 00:12:13.109
Andy Whiteside: Number two in here is read-only operating system.

87
00:12:13.410 --> 00:12:14.030
Chris Feeney: Enough.

88
00:12:14.698 --> 00:12:24.101
Chris Feeney: Again, you know, one of the things that is removed out of the gate is the ability to go and try to download software that

89
00:12:25.581 --> 00:12:34.020
Chris Feeney: might be built for Linux, or, if it's Windows, Windows software, right? None of that exists. We control all that. But furthermore, the operating system is

90
00:12:34.390 --> 00:12:41.519
Chris Feeney: locked down, so that the pieces that need to talk to each other, to communicate and effectively run as an OS, are there,

91
00:12:41.830 --> 00:12:54.610
Chris Feeney: but you can't modify or change or write to it. I mean, you know, you put a config on it, and it's just reading that and then giving you the ability to do what you need to do from there, in the world of end-user computing. So.

92
00:12:55.487 --> 00:13:04.089
Andy Whiteside: The third one here, in the order I'm going, is trusted, and I'm gonna add the word limited, trusted and limited on purpose, application platform.

93
00:13:05.390 --> 00:13:19.080
Chris Feeney: So, kind of going back, for those folks that have been with IGEL for a while or know of it: in the prior approach, we had the OS, which we also called firmware,

94
00:13:19.100 --> 00:13:32.750
Chris Feeney: and it had the applications that we included in it, and those were obviously vetted and trusted to make sure that they were built and able to run on IGEL's operating system. In the world of OS 12,

95
00:13:33.180 --> 00:13:43.439
Chris Feeney: those apps are now separated out into a portal, where obviously they have to go through a vetting process and security review and all that stuff, especially if a

96
00:13:43.780 --> 00:14:02.859
Chris Feeney: partner is building them. But certainly if IGEL is building them, we have this mechanism to make sure that if that app is then installed on IGEL's OS, this read-only OS, it is a secure, trusted application that is verified to run correctly, not just some third-party thing that can all of a sudden do various things.

97
00:14:02.860 --> 00:14:23.520
Andy Whiteside: Which in theory is kind of the idea behind Linux to begin with, because it doesn't have the wide-open application development that Windows does. It's not intended to be Windows, whereas Windows from day one was literally wide open. The reason I'm a Windows administrator by original training is because I could steal it, and steal applications, and run them on top of there, and learn.

98
00:14:23.820 --> 00:14:27.570
Andy Whiteside: Can't do that with Linux. Well, you can't do that with an enterprise-worthy...

99
00:14:27.570 --> 00:14:28.830
Chris Feeney: Perfect process.

100
00:14:29.370 --> 00:14:31.489
Chris Feeney: Yeah, like I said,

101
00:14:32.185 --> 00:14:35.110
Chris Feeney: we've had the ability to take

102
00:14:35.900 --> 00:14:38.080
Chris Feeney: Linux apps, and then

103
00:14:38.110 --> 00:14:40.669
Chris Feeney: with the mechanism, make them

104
00:14:40.740 --> 00:14:44.990
Chris Feeney: able to run on IGEL in a read-only fashion. The real key thing is,

105
00:14:45.592 --> 00:15:05.267
Chris Feeney: if I go and install it that way, which we usually referred to as our custom partitions, you do it in such a way that, A, it's gonna interact with the operating system correctly, but then, on a reboot, it will be retained, you know, and still be there. But in order to do that correctly, it has to go through this trusted-application kind of process that we're talking about.

106
00:15:06.040 --> 00:15:20.380
Andy Whiteside: Alright, number, I don't know, four on the list here is MFA and single sign-on integration. We've kind of covered that already, but there seems to be some guidance at IGEL that that's gonna be a high-priority part of the stack.

107
00:15:20.940 --> 00:15:29.542
Chris Feeney: I think it comes down to, like, what do you want the user experience to be? Do I want them to authenticate before they get to the OS underneath?

108
00:15:30.090 --> 00:15:39.460
Chris Feeney: If so, then MFA is probably your way to go, or some kind of single sign-on component. Or do they just boot, get to the OS

109
00:15:39.690 --> 00:15:40.750
Chris Feeney: desktop.

110
00:15:40.870 --> 00:15:47.020
Chris Feeney: and then from there go to launch their session? Obviously, you know, you want to put some MFA around that,

111
00:15:47.846 --> 00:15:49.160
Chris Feeney: and so

112
00:15:51.040 --> 00:15:55.875
Chris Feeney: You know, whether it's a browser going to... I think you had it up earlier,

113
00:15:56.630 --> 00:16:03.640
Chris Feeney: you had a browser that would take you to applications, stuff like that. Or maybe it's an administrator console, whatever it is. But, you know, there's some sort of

114
00:16:03.930 --> 00:16:10.207
Chris Feeney: you know, MFA involved, you know, two-factor auth, or what have you,

115
00:16:10.670 --> 00:16:14.390
Chris Feeney: and just native integration with those things. So that's what we've got there.

116
00:16:16.033 --> 00:16:21.849
Andy Whiteside: I'm curious about this next one, data encryption. Is that local, or is that in transit, or both?

117
00:16:22.260 --> 00:16:27.949
Chris Feeney: I think it's both. I mean, there is a component where you can put some encryption on the disk,

118
00:16:28.010 --> 00:16:37.260
Chris Feeney: and I happened to be in Germany a couple of months ago, in a security meeting, and I asked them specifically, and what that is is,

119
00:16:37.520 --> 00:16:43.470
Chris Feeney: it's nothing more than saying, you know, hey, if I've got a configuration on the device,

120
00:16:43.720 --> 00:16:50.940
Chris Feeney: you can do some disk encryption, you know, to protect the configuration of the device. That's really about it.

121
00:16:51.626 --> 00:16:58.399
Chris Feeney: And there were some industries or customers that wanted that extra layer of security. So it's, like...

122
00:16:59.200 --> 00:17:02.059
Chris Feeney: I'd definitely call it a belt-and-suspenders approach,

123
00:17:02.140 --> 00:17:05.700
Chris Feeney: where the belt by itself definitely gets the job done.

124
00:17:06.109 --> 00:17:23.630
Chris Feeney: The suspenders are maybe just in case something happens. But it's really just protecting the configuration. It's not like there's data on the device, like we talked about earlier; it's just the configuration of the device. So that's on the endpoint. And then, of course, all the traffic in between, wherever you're going, is certainly encrypted,

125
00:17:24.940 --> 00:17:36.370
Chris Feeney: you know, like if you're going to Citrix or Horizon or Azure, whatever it might be. Those connections, I mean, in today's world, if they're not encrypted, then you shouldn't be doing them. But they are, so.

126
00:17:37.614 --> 00:17:46.419
Andy Whiteside: And then finally, the last one, and I'm curious how this one applies to the no-local-data as well as the encryption, and that's the modular design approach.

127
00:17:47.120 --> 00:17:57.322
Chris Feeney: Yeah. So, like I said, in the OS 11 world, everything was sort of built into that firmware, and then you could kind of enable the things you wanted to, but you also had the ability to remove things you did not need.

128
00:17:57.630 --> 00:18:04.069
Chris Feeney: So if you were a Citrix shop and you didn't need the Horizon components built into the firmware, you could remove those,

129
00:18:04.170 --> 00:18:07.910
Chris Feeney: and then the process would just remove them out of that image.

130
00:18:07.930 --> 00:18:24.020
Chris Feeney: In OS 12, it's the OS, and you layer on the pieces that you need. And so, if you only need certain apps and no others, it's kind of that modular design. But even inside the operating system itself, the way we've kind of built it out, and we talk about this through

131
00:18:24.140 --> 00:18:32.570
Chris Feeney: some of our security documentation, there's a link here to a security white paper that goes into more detail, but it's basically just saying, hey, these modules

132
00:18:32.640 --> 00:18:37.230
Chris Feeney: inside the OS itself, they're core, and, you know, they're modularized so that

133
00:18:37.250 --> 00:18:52.728
Chris Feeney: one doesn't kind of take over another, whatever. But then, of course, you layer in the apps, which are also sort of in their little buckets, designed to interact with the parts of the OS that they need to. So, for example, I need to put some certificates on the device.

134
00:18:53.100 --> 00:18:56.599
Chris Feeney: So we put those in a specific folder, or a

135
00:18:57.090 --> 00:18:59.030
Chris Feeney: partition.

136
00:18:59.060 --> 00:19:13.759
Chris Feeney: I should say. And then you bring in a browser, then you bring in the Citrix app or the Horizon app, and they need to know how to talk to and find those certificates, you know. So they're built to know exactly where in this modular design to go to find them.

137
00:19:13.790 --> 00:19:15.789
Chris Feeney: Yeah, that type of approach.

138
00:19:16.550 --> 00:19:23.709
Andy Whiteside: So, Chris, to kind of sum up this podcast, I would just say, from a security perspective, specifically from a security perspective,

139
00:19:24.580 --> 00:19:30.010
Andy Whiteside: when I meet with somebody, I ask them, in the various use cases and scenarios we're talking about,

140
00:19:30.440 --> 00:19:36.099
Andy Whiteside: are they running Windows on the endpoint, and why? I'm not saying it's wrong. Just tell me why.

141
00:19:36.180 --> 00:19:45.439
Andy Whiteside: And there's a blend between, you know, "it's just what we do," "I don't know any better," and user experience. But then you've gotta factor in the security one.

142
00:19:45.620 --> 00:20:04.859
Andy Whiteside: And I literally want every customer, in every single workflow and use case, to tell me why they're running Windows on the endpoint versus some type of delivered model, which could just be a browser, in a more secure, enterprise-worthy, manageable way. Go back to the center of this design, and that's, you know, a management console that has control over this whole environment.

143
00:20:04.860 --> 00:20:23.470
Andy Whiteside: And it's not this crazy, you know, distributed architecture that Microsoft has, which was great in the beginning, when we didn't have networks like we have now. But you just don't need it anymore. Why are people running Windows? They need to be able to answer that question. If they can't answer that question, somebody needs to challenge it.

144
00:20:24.270 --> 00:20:28.337
Chris Feeney: Oh, I think so, especially on the healthcare side. But it's...

145
00:20:28.930 --> 00:20:32.150
Chris Feeney: We've talked about this in general many times on this podcast:

146
00:20:32.250 --> 00:20:35.909
Chris Feeney: at the end of the day, it's the users needing to get to an app

147
00:20:36.080 --> 00:20:37.280
Chris Feeney: of some kind

148
00:20:38.215 --> 00:20:40.980
Chris Feeney: and then it's the user experience around that

149
00:20:41.260 --> 00:20:52.940
Chris Feeney: You know, for example, do you put a front door on that experience, where they have to authenticate first, and then, to make it easier, they just click on something and they kind of get logged in, that single sign-on experience? Or,

150
00:20:53.080 --> 00:20:54.210
Chris Feeney: if you don't

151
00:20:54.270 --> 00:21:04.980
Chris Feeney: need that, you just have them pull up whatever's available, and then they authenticate, they get to their stuff. You know, an OS is an OS at the end of the day. If I get an iPad,

152
00:21:05.890 --> 00:21:11.229
Chris Feeney: I'm not getting the iPad because of the cool operating system; I'm using apps on that iPad,

153
00:21:11.310 --> 00:21:15.929
Chris Feeney: you know. And so, understanding, like, why are you using Windows as your endpoint OS?

154
00:21:16.270 --> 00:21:21.160
Chris Feeney: Well, I need them to get to this, that, and the other. Okay, can that be done just through a browser?

155
00:21:21.847 --> 00:21:26.739
Chris Feeney: Do you have MFA in front of that? I mean, just all kinds of things, right? And just rethinking the

156
00:21:26.950 --> 00:21:33.240
Chris Feeney: user experience, right? Does everybody need to have the same experience, or do you need to have certain user personas, whatever,

157
00:21:33.450 --> 00:21:34.283
Chris Feeney: you know.

158
00:21:34.790 --> 00:21:41.969
Chris Feeney: And you've done a really amazing job of kind of trying to challenge that. And that's obviously what IGEL is focused on, this,

159
00:21:42.300 --> 00:21:48.109
Chris Feeney: you know, this disruptive nature of rethinking how you are doing endpoint management.

160
00:21:48.440 --> 00:21:52.420
Chris Feeney: And if you're not thinking about security right out of the gate, you're doing it wrong.

161
00:21:52.610 --> 00:21:53.420
Chris Feeney: So.

162
00:21:53.420 --> 00:21:56.900
Andy Whiteside: I mean, it's fine if the answer is Windows, but there should be a reason why you're running Windows, not...

163
00:21:56.900 --> 00:22:01.931
Chris Feeney: Yeah, I mean, there are gonna be legitimate scenarios where, yeah, we do need it. Okay, it's fine.

164
00:22:02.895 --> 00:22:06.869
Chris Feeney: But there's plenty of other situations where you may not need to, so.

165
00:22:08.380 --> 00:22:12.149
Andy Whiteside: Well, Chris, I think we covered it. Short messages:

166
00:22:12.380 --> 00:22:14.939
Andy Whiteside: security matters, user experience matters,

167
00:22:15.240 --> 00:22:23.309
Andy Whiteside: enterprise management abilities matter. IGEL has a solution for that entire stack. But from this perspective, from a security perspective,

168
00:22:23.990 --> 00:22:24.810
Andy Whiteside: it's

169
00:22:25.030 --> 00:22:26.669
Andy Whiteside: it needs to be considered.

170
00:22:27.100 --> 00:22:34.756
Chris Feeney: Yeah, absolutely. And I think, like every organization, I've had multiple conversations, you probably have, too: you've gotta find the right balance between security and convenience.

171
00:22:35.360 --> 00:22:43.219
Chris Feeney: And that's always been sort of the dance, if you will: what is the right approach? In healthcare, you've got to take into account

172
00:22:43.630 --> 00:22:46.670
Chris Feeney: how people use machines, how they use technology.

173
00:22:46.900 --> 00:23:00.970
Chris Feeney: You know, they're not all always carrying around a laptop that they don't share. There's shared workstations, there's non-shared workstations, there's all kinds of scenarios, and it's finding the best approach and then marrying the security and convenience factors.

174
00:23:00.970 --> 00:23:01.390
Andy Whiteside: Yeah.

175
00:23:01.930 --> 00:23:06.949
Andy Whiteside: Got one large healthcare customer now moving the entire organization, thousands of people, to Apple.

176
00:23:10.240 --> 00:23:18.389
Chris Feeney: Somebody sold them on that. I'm very curious. I'm sure there's a good reason for it, or not.

177
00:23:18.390 --> 00:23:27.654
Andy Whiteside: It was a C-level's idea, and he controls the purse strings, and that's what they're gonna do. And, to be honest, from a functionality and a security perspective, I don't know that I fault them for that.

178
00:23:28.328 --> 00:23:32.490
Andy Whiteside: The cost is just going to be astronomically high to maintain.

179
00:23:32.490 --> 00:23:37.139
Chris Feeney: It's interesting. So James Millington, he tells a story:

180
00:23:37.210 --> 00:23:51.799
Chris Feeney: before he came to IGEL, he was at another organization, and they just used a browser for everything. Just a browser; they were using Google Apps or whatever. But yet they sent him a really expensive Mac, which he technically didn't need. He just needed

181
00:23:52.250 --> 00:23:56.540
Chris Feeney: something that was functional that would help him open up a browser and do his job.

182
00:23:57.440 --> 00:24:07.079
Chris Feeney: But it didn't require a really nice, expensive Mac. And Macs are nice; so are other devices. But yeah, so that'll be interesting to see.

183
00:24:09.080 --> 00:24:09.910
Chris Feeney: But

184
00:24:10.210 --> 00:24:14.989
Chris Feeney: sometimes you have situations like that where, you know... And Macs are secure,

185
00:24:15.150 --> 00:24:19.260
Chris Feeney: not completely, not 100%, but, you know, they're secure,

186
00:24:19.500 --> 00:24:24.420
Chris Feeney: anyways. But IGEL is very secure as well, and built that way from the ground up. So.

187
00:24:25.420 --> 00:24:27.989
Andy Whiteside: Chris, I'll let you go. Thanks for taking time to cover this today.

188
00:24:27.990 --> 00:24:29.810
Chris Feeney: Always a pleasure, Andy. Thank you, brother.

189
00:24:29.810 --> 00:24:31.226
Andy Whiteside: See you in 2 weeks.