>> next, officials testifying on the use of biometrics and its impact on privacy. they are asked about the use of facial recognition technology and its associated risks. the hearing, before the house science committee's subcommittee on investigations and oversight, is just under an hour and a half. >> today our focus will be on how technological solutions can secure our privacy while allowing us to enjoy the benefits of biometric tools. privacy protections can and should be implemented along with biometric technologies. so-called pets, privacy-enhancing technologies, can be implemented at the point of collection, improving the precision of collection to ensure systems are not using features that are not necessary for the use at hand. they can insert, for example, expiration dates on data collected, guarding against unintended uses. a technique called template protection can ensure that one system's biometric information is encrypted such that it cannot be read by another system. for example, someone's image obtained from the security systems at a doctor's or psychiatrist's office cannot be linked to that person's workplace identity verification system. federal agencies, including nist, represented at this hearing today, as well as the dhs science and technology directorate, are already working to develop and improve privacy-protecting technologies for biometrics. we must future-proof the government's standards for biometric application systems and invest in privacy technologies. i look forward to hearing from our panel about how we can further invest in these solutions as biometric technologies become more and more prevalent in our daily lives. and the timing of our discussion today is notable. the supreme court has recently and substantially weakened the constitutional right to privacy in its recent decision overturning roe v. wade. biometrics can prove where someone has been and what they did when they were there, and third parties can access biometric information through bounties being offered by some states to enforce their new laws. that makes protecting biometric data more important than ever. finally, i want to observe that some of our witnesses' testimony came late for this hearing, and i apologize to the other members of the subcommittee that we did not have the usual amount of time we normally would like to have had. the chair will now recognize the ranking member of the subcommittee for an opening statement. >> thank you very much, chairman foster. good morning everyone. i am excited about our hearing this morning on the benefits and risks of biometric technologies and about exploring research opportunities in these technologies. i am hoping that this hearing turns into a productive discussion that helps us learn about ways to improve biometric technology in the future while, at the same time, protecting people's rights and privacy. i was reflecting this morning on the fact that biometric technology has changed the way we live our lives. this morning, i used facial recognition to unlock my phone. i used the fingerprint reader on my computer to open my macbook. when i got in my car this morning to come to the district office, the car recognized my face to set the seat settings, and while i was driving, it used facial recognition to make sure i was paying attention to the road. that is in the first couple hours of the day. it has definitely changed our lives, and it is amazing to think that this was once the world of science fiction and now we take it completely for granted. obviously, biometrics bring a lot of benefits to our daily lives.
and we want to make sure that we are able to continue to allow those benefits while protecting the privacy of the people that rely on the biometrics. for that reason, i am particularly glad that nist is here with us today to talk about the work they are doing. nist has been working on research and development in biometrics for over 60 years. they have had an incredible role to play in developing standards for biometrics, and i am hoping that, in the same way they helped the fbi establish standards for fingerprint technologies in the 1960's, they will continue to take a leadership role in establishing national and international standards for biometrics today. these standards are going to be critical to enabling the exchange of biometric data between agencies and their systems, as well as providing guidance for how those biometric systems are tested, how performance is measured, and how assurances are made that data is shared securely and privacy is protected. that is important because, as we all know, biometrics are no different than any other advanced technology in that, while they have beneficial uses, they can also be misused to harm individuals and harm our society, in this case by compromising the privacy of individuals or the security of their information. so, as policymakers, we need to be acutely aware of not only the benefits that biometrics bring to our society but also the risks associated with the technology. especially, in my opinion, when it comes to covert collection and the issue of individual consent to have one's information stored and used. i think, as policymakers, we have to balance that awareness against the potential benefits that biometrics bring to society. you could easily imagine us taking a different approach to regulating biometrics which would lose all of the benefits that we enjoy from them. i am not just talking about unlocking our phones or setting the seats in our cars; biometric technologies really have extraordinarily helpful applications. to give you examples: in ukraine, the defense ministry is using ai-based facial recognition technology to recognize russian assailants and identify combatants. an analytics tool called traffic jam is using facial recognition and ai to detect patterns in trafficking, helping law enforcement identify victims of sex trafficking. if we were to take a heavy-handed approach to regulating biometrics, we would lose out on those life-saving applications as well. and that is something i have first-hand experience with. before serving in congress, i was in the california state legislature, where i served on the committee for privacy and consumer protection. i can tell you, we saw a lot of bills that were misguided proposals, and they could have effectively banned facial recognition technology altogether. so it is clear that it is easier for us to pass legislation that protects people's privacy without stifling the innovation that is going to lead to future benefits for society. i am very much looking forward to learning about their work today and to hearing from our witnesses. thank you, chairman foster, for convening the hearing. i am very much looking forward to the discussion, and i yield back.
thank you. i have to say i am envious of the car you must be driving. with all those features, i wager that you are probably not driving around in an 18-year-old ford focus. >> actually, that technology is coming to inexpensive cars as well. >> that's right. we will accept additional statements for the record at this time. i would now like to introduce ms. wright. she is the director of science, technology assessment, and analytics at the government accountability office. she oversees gao's work on federally funded research and federal efforts to commercialize innovative technologies and enhance u.s. economic competitiveness. since joining gao in 2004, she has written reviews on a wide range of policy issues. after ms. wright is dr. charles romine. dr. romine is the director of the information technology laboratory, itl, one of six research laboratories within the national institute of standards and technology, where he oversees a research program to cultivate trust by developing standards and testing for the interoperability, security, and reliability of information systems. our final witness is dr. arun ross. he is a professor in the department of computer science and engineering at michigan state university, and he also serves as the director of the center for identification technology and research. his expertise spans biometrics, computer vision, and machine learning. he has advocated for responsible use of biometrics in multiple forums, including a nato advanced research workshop on security. each of you will have five minutes for your spoken testimony; your written testimony will be included in the record of the hearing. when you have completed your spoken testimony, we will begin with questions. each member will have five minutes to question the panel, and if time permits we may have two rounds of questions for our panel. so we will start with ms. wright. >> i think you are on mute, i'm afraid. >> thank you. chairman foster, ranking member obernolte, and members of the subcommittee: facial recognition technology is used to compare facial images from photos and video for identification and verification. as the technology has continued to rapidly advance, its use has expanded in both the commercial and government sectors. today, i will share highlights on how agencies are using facial recognition and federal efforts to assess and mitigate privacy risks. last year we reported on the results of our survey of the 24 largest agencies and their use of facial recognition technology. 18 agencies reported using this technology. the most common use was in smartphones provided by agencies. other uses included law enforcement generating leads for criminal investigations, as well as monitoring or controlling access to a building or facility to, for example, identify someone on a watchlist, greatly reducing the burden on security personnel to memorize faces. federal agencies may own their own systems or access systems owned by state and local governments or commercial providers. for example, dhs reported using it to identify victims in trafficking and child exploitation cases. agencies are also investing in research and development to further understand applications of the technology. for example, dhs's science and technology directorate offers challenges for industry to develop systems; one recent challenge involved collecting and matching images of individuals wearing masks. in fact, during the course of our work, multiple agencies had to poll their employees and discovered they were using nonfederal systems even though the agency initially told us otherwise.
this can put agencies at risk of running afoul of privacy-related laws and guidelines. there are also risks particular to biometrics: unlike a password that can be changed, a compromised biometric may have more serious consequences. agencies need to improve their processes for tracking the facial recognition systems used by their employees and to assess the risks of such systems. agencies are also in various stages of implementing privacy protections; take, for example, tsa and cbp. we found that tsa had incorporated privacy protections for its pilot program testing the use of the technology for traveler identity verification at airport security checkpoints. however, cbp's privacy notices to inform the public of where facial recognition would be used in its biometric entry-exit program were not always complete. further, cbp had not conducted audits of its commercial airline and airport partners to ensure compliance with cbp's own requirements and restrictions for retaining or using traveler photos. fully implementing our recommendations will be an important step to protect everyone's information. in closing, facial recognition technology is not going away, and the demand for it will likely continue to grow. as agencies continue to find ways to use the technology in their missions, privacy protections will continue to be important. chairman foster and other members of the subcommittee, this concludes my remarks. i would be happy to answer any questions you may have. >> thank you. next is dr. romine. >> chairman foster, ranking member obernolte, and distinguished members of the subcommittee, i am charles romine of the national institute of standards and technology, known as nist. thank you for the opportunity to testify today on behalf of nist about our efforts on privacy-enhancing applications of biometric technologies. nist is home to five nobel prize winners, with a focus on national priorities such as artificial intelligence, the digital economy, quantum information science, biosciences, and cybersecurity. the mission of nist is to promote u.s. innovation and industrial competitiveness. in our information technology laboratory, we work to cultivate trust in information technology. trust in the digital economy is built on key principles like cybersecurity, privacy, interoperability, equity, and avoiding bias in the deployment of technology. nist conducts fundamental and applied research, advances standards to understand and measure technology, and develops tools to evaluate such measurements. nist's technology standards and foundational research are critical enablers. with robust collaboration with stakeholders across government, industry, international bodies, and academia, nist aims to cultivate trust and foster an environment for innovation on a global scale. the goal has been to develop security by design, applying engineering principles, tools, and standards that protect privacy and, by extension, civil liberties. the ability to conduct privacy risk assessments is essential for organizations to select effective mitigation efforts. modeled after our highly successful cybersecurity framework, the privacy framework is another tool nist developed. it is intended to support organizations' decision-making in product and service design and deployment, to optimize beneficial uses of data while minimizing adverse consequences for individuals and society as a whole. since 1986, nist has coordinated the ansi/nist-itl standard, the data format for the interchange of fingerprint, facial, and other biometric information in law enforcement applications, updated over time to include face, voice, and dna. the standard is used globally by law enforcement,
homeland security, defense, and other identity management system developers to ensure that biometric information interchanges are interoperable and maintain system integrity and efficiency. since 2002, nist has also aided the development of international standards for biometric data in civil applications, including id cards and e-passports. different uses of biometrics, for example as authenticators to protect sensitive data or as convenient identity solutions for fraud prevention, require organizations to consider which privacy-protective outcomes are suitable to their uses. the research on privacy-enhancing technologies that nist conducts, and the guidelines and standards that nist publishes, help organizations select effective mitigations tailored to their risks. privacy safeguards human autonomy and dignity as well as civil rights and civil liberties, and nist has prioritized research, guidelines, and standards that protect privacy. in addition to maintaining the privacy framework, nist also includes privacy considerations in many security guidelines, as well as in the draft ai risk management framework. thank you for the opportunity to present on nist's activities in privacy-enhancing technologies for biometric applications, and i look forward to your questions. >> thank you. and after dr. romine is dr. ross. >> chairman foster, ranking member obernolte, and esteemed members of the subcommittee, i am grateful for the invitation to testify today. i consider this to be a great privilege and an honor to engage with the body that so graciously serves our nation. biometrics is a valuable technology that has brought benefits to a number of different domains; however, it is necessary to ensure that the privacy of individuals is not inadvertently compromised when their biometric data are used in a certain application. the purpose of my testimony is to communicate some of the ways in which the privacy of the biometric data of individuals can be enhanced, thereby facilitating the responsible use of this powerful technology. first, the benefits of biometrics. the need for determining the identity of a person is critical in a vast number of applications, ranging from personal smartphones to border security, from self-driving vehicles to tracking missing children to the personalization of customer service. biometrics is increasingly being used in such applications. for instance, many smartphones employ automatic face or fingerprint recognition for unlocking and payment purposes. this increased use of biometric technology is being driven by significant improvements in the recognition accuracy of these systems over the past decade, owing in no small part to the phenomenal rise of the deep learning paradigm. this brings me to my second point: privacy concerns associated with the technology. the face images of an individual can be linked across different applications using biometric technology, thereby creating a comprehensive profile of that individual and in some cases unintentionally divulging the person's identity where privacy was expected. as another example, rapid advances in the fields of machine learning and ai have led to the development of so-called attribute classifiers that can automatically extract information pertaining to age, sex, race, and health cues from an image. this can potentially breach the privacy of individuals. one more example: a number of face data sets have been curated for researchers by scraping publicly available facial images. concerns have been expressed about using these images for research purposes.
in principle, therefore, an anonymous face can be linked to one or more face images in a curated data set, thereby potentially revealing the identity of the anonymous face. now, to my final point: how can biometric technology be responsibly used while keeping privacy intact? firstly, by utilizing schemes such as homomorphic encryption, which not only ensures that the original biometric data is never revealed, but that all computations take place in the encrypted domain. secondly, by intentionally distorting the biometric data; the distorted data can still be successfully used for recognition purposes within a certain application. thirdly, by designing cameras that acquire images that are not interpretable by a human. such cameras have been proposed for public spaces to ensure the acquired images are not viable for previously unspecified purposes. i must note that researchers in biometrics are becoming increasingly aware of the privacy and ethical implications of the technology they are developing. recognition accuracy is no longer the only metric being used to evaluate the overall performance of a biometric system; rather, metrics related to security and privacy are also being increasingly considered. this shift in the culture is remarkable and bodes well for the future of the technology. thank you, and i welcome any questions.
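dr. ross's first technique, matching in the encrypted domain, can be illustrated with a toy additively homomorphic scheme. the sketch below is a minimal paillier-style demonstration of the idea he describes: a server computes the hamming distance between a client's encrypted template and its own plaintext template without ever seeing the client's bits. the primes are demo-sized and the templates are made up; this is an illustration of the concept, not a production protocol.

```python
# toy paillier keypair: additively homomorphic, so Enc(a) * Enc(b) = Enc(a + b).
# demo-sized primes only; a real system would use a vetted library and
# 2048-bit-plus keys.
import math
import random

p, q = 104729, 104723            # small known primes, for illustration only
n, n2 = p * q, (p * q) ** 2
g = n + 1
lam = math.lcm(p - 1, q - 1)

def L(x):
    return (x - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)

def encrypt(m):
    r = random.randrange(2, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(2, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

# client encrypts its binary template bit by bit and sends only ciphertexts
client_bits = [1, 0, 1, 1, 0, 0, 1, 0]          # hypothetical iris/finger code
enc_bits = [encrypt(b) for b in client_bits]

# server computes the hamming distance homomorphically: b xor s = b + s - 2bs,
# which for a plaintext server bit s is Enc(b) when s == 0, and
# Enc(1 - b), i.e. Enc(1) * Enc(b)^(n-1), when s == 1.
server_bits = [1, 0, 0, 1, 0, 1, 1, 0]
acc = encrypt(0)
for c, s in zip(enc_bits, server_bits):
    term = c if s == 0 else (encrypt(1) * pow(c, n - 1, n2)) % n2
    acc = (acc * term) % n2                      # homomorphic addition

# only the distance is ever decrypted; the raw template never leaves the client
distance = decrypt(acc)
print("hamming distance:", distance)             # 2 for the bits above
assert distance == sum(b ^ s for b, s in zip(client_bits, server_bits))
```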
>> thank you. we will begin our first round of questions, first on the prospects for secure and privacy-preserving digital id. while we are all aware of concerning aspects of biometric technologies, it's important to recognize that there are valuable uses for these technologies that can improve our lives and our security. privacy protections must evolve along with capabilities so we can reap the benefits safely. our improving digital identity act and a bipartisan group of colleagues have called upon federal agencies to modernize and harmonize our nation's digital identity infrastructure, in large part by leveraging the biometric databases that individual states already have in place as part of their programs to support real id, and additionally by using standards to make sure these tools are interoperable and can be used for presenting that identity both online and offline in a privacy-preserving way. how could biometric technologies increase our privacy by making our identities more secure against theft and fraud? dr. romine: i appreciate the concern that you and the ranking member have on this issue. the guidelines that we have put in place for privacy-enhancing technologies broadly speaking, and the investments in our privacy engineering program, relate to understanding how we can develop new technology that can enhance privacy protections in many different aspects of technologies. that, coupled with the guidance that we are updating today on identity management and appropriate protections for identity management technologies, means there are certainly going to be opportunities to improve the protections of biometric information across the board through some of these updated guidelines. i look forward to discussing that with you and your staff. >> any broad implementation of biometric identification techniques would require broad implementation of privacy-protective measures. how could those methods be strengthened, and are they ready for prime time, things like homomorphic encryption? i'm told there is a privacy budget that you have to enforce: you can't just interrogate a system protected by homomorphic encryption repeatedly without at some point revealing the underlying database. there must be limits to these. have we understood and hit the limits of these, or is there a lot of work yet to be done to understand how effective it can be to exchange information between trusted entities without revealing everything? >> a very short time ago, homomorphic encryption was a theoretical idea whose performance was so unbelievably slow that it was not practical. since then, enormous strides have been made in improving the performance. i will say that these privacy-enhancing technologies, particularly those using cryptography as a protection mechanism, have enormous potential, but there is still a lot more work to be done in enhancing them to make them significantly more practical. as you point out, there are situations in which, even with a database obscured through encryption, if you provide enough queries and have a machine learning back end to take a look at the responses, you can begin to infer some information. we are still in the process of understanding the specific capabilities that encryption technologies such as homomorphic encryption can provide in support of that. >> dr. ross, do you have any comments? particularly on the idea that you can cancel your fingerprints in some sense. does that really work yet? dr. ross: thank you for your question. cancelable biometrics has been proposed as one way to preserve the security and privacy of the biometric data while offering the ability to cancel one's biometric template. the way it works is as follows. let's say you have a fingerprint image. you subject it to some distortion using a mathematical function, and the distorted image is used for matching purposes. if the distorted image is compromised, you would just change the mathematical function: you cancel the original template, and you generate a new fingerprint template based on the revised mathematical function. in principle, this allows us to not store the person's original fingerprint but only the distorted version, the transformed version, of the fingerprint. that is the cancelable property, which is really imparted by the transformation function.
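a minimal sketch of the revocable transform dr. ross outlines, assuming a seeded random projection as the "mathematical function" and cosine similarity for matching. the feature vectors, seeds, and threshold are illustrative stand-ins, not a production template-protection scheme.

```python
import numpy as np

def transform(features, seed):
    """distort features with a per-user, revocable random projection."""
    rng = np.random.default_rng(seed)
    proj = rng.standard_normal((features.size, features.size))
    out = proj @ features
    return out / np.linalg.norm(out)

def match(stored, probe, threshold=0.9):
    """cosine similarity in the distorted domain; raw features never compared."""
    return float(stored @ probe) >= threshold

rng = np.random.default_rng(0)
enrolled = rng.standard_normal(64)                 # stand-in fingerprint features
enrolled /= np.linalg.norm(enrolled)
probe = enrolled + 0.05 * rng.standard_normal(64)  # same finger, fresh scan
probe /= np.linalg.norm(probe)

seed_v1 = 12345                                    # the secret transform "key"
stored = transform(enrolled, seed_v1)

# a genuine probe still matches inside the distorted domain
print(match(stored, transform(probe, seed_v1)))    # True

# if the stored template leaks, "cancel" it: re-enroll under a new seed.
# the old and new templates are unlinkable, so the leak is contained.
seed_v2 = 99999
stored_v2 = transform(enrolled, seed_v2)
print(match(stored, stored_v2))                    # False
```

the design choice mirrors ross's description: the database holds only the projected vector, and revocation is just a key change followed by re-enrollment.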
>> i don't want to overuse my time, which has expired, but we should be able to get to a second round. i now recognize our ranking member for five minutes. >> i have been reflecting on the fact that when we talk about privacy, it is a non-binary ethical problem. you can't say that data is completely private or not. we are dealing with a strange continuum where we have to weigh the amount of privacy we are willing to give up against the potential benefit that we expect by giving up that privacy. it's a complicated thing. i would like to organize my questions around that, because i think that solving that problem is going to be key to establishing a regulatory framework for what we will ask companies to do to protect privacy. i'm happy to see gao participate in this hearing; it sends a powerful statement to those that we intend to regulate when we start with ourselves in government, because obviously we interact with a lot of data from different users, and we should be experimenting on ourselves in solving this problem before we expect others to solve it. i found your testimony very compelling. i was very alarmed when i read that 13 out of 14 agencies that you surveyed did not have complete information about their own use of facial recognition technology. then i realized most of those were people using facial recognition technology to unlock their own smartphones. it made me think about the fact that maybe there is a difference between privacy when it comes to our own data, when i'm using my face to unlock my phone, and privacy when we are using other people's data, especially when we have a large amount of data. when we do these surveys in the future, do you think we need to make a distinction between those different kinds of uses? >> i think that's important, but in the cases where we found agencies didn't know what their employees were using, it was the use of nonfederal systems to conduct facial image searches, such as for law enforcement purposes. they didn't have a good sense of what was happening in the regional and local offices, and that's why we think it's important for agencies to have a good understanding of what systems are being used and for what purposes, and also to make sure that, by accounting for that, they have the necessary tools to ensure they are balancing the potential privacy risks associated with using the systems. >> all of these things, when you are using a commercial source for this kind of technology, have to go through procurement, right? would procurement be a fruitful avenue to look at in terms of informing this flow of information? >> there are a couple of different scenarios: one in which agencies might have been accessing state and local systems or commercial systems through a test or trial, and then there might be instances where they have an acquisition or procurement in place. we have ongoing work right now looking at law enforcement use and the kinds of mechanisms they are using in acquiring systems from commercial vendors. i think that information will be telling for us to determine what privacy requirements are being put in place when agencies are acquiring services from these commercial vendors. >> dr. romine, i found it interesting in your written testimony when you were talking about the privacy framework that it's not a static thing. could you talk a little bit about how you would evaluate the fact that this has to be dynamic? part of it has to be based on use. if you're using facial recognition for verification, that is different than identification. users' expectations on privacy are going to be different. how do you approach that ethical conundrum? dr. romine: that's exactly right. the context of use is critical to understanding the level of risk associated with privacy considerations. one of the things the privacy framework is intended to do is give organizations the ability to establish privacy risk principles as part of their overall risk management for the enterprise. you talk about reputational risk and financial risk and human capital risk; privacy has not typically been included in that. we're giving organizations the tools now to understand that data gathered for one purpose, when it is translated to a different purpose, in the case of biometrics can have a completely different risk profile associated with it. it isn't inherent in the data; it is the context in which those data are being used. our tools allow for a deeper understanding on the part of the organization of that context issue. >> if we get another round, i'm going to ask you about that further, because it goes right into what we were talking about with the framework. >> we will now recognize representative bice for five minutes. >> thank you. i have a couple of questions i want to touch on. this is a topic of conversation that has come up in oklahoma a couple of times on the state side.
you testified that most agencies accessing nonfederal facial recognition technology don't track its use or assess the related privacy risks. is there any federal law that requires agencies to track this information? ms. wright: there is a broad privacy framework: you have the privacy act, which does call for agencies to limit their collection as well as disclosure and use of personal information in government systems, and a photo would be considered an example of personal information. you also have the e-government act, which provides provisions for agencies to conduct privacy impact assessments when they are using systems, and to be able to use those assessments to analyze how the information is collected, stored, shared, and managed in federal systems. it should be noted that these privacy requirements apply to any systems being operated by contractors on behalf of federal agencies. >> we haven't even talked about the contractor piece. i want to circle back around to your comment about these assessments. do you think that agencies are doing the assessments, and if so, are those outcomes published so that other agencies can understand the risks or the breadth of what they are utilizing within the agencies? >> we have seen a mix of how agencies are approaching the privacy impact assessments. one of the things i mentioned was that when you have employees using systems and the agencies aren't even aware, there is the possibility that the risks have not been assessed, and that's an important thing for agencies to keep in mind as they continue to use facial recognition systems. >> would it be helpful for congress to look at requiring these assessments to be done on a periodic basis for agencies that are utilizing these types of biometrics? >> the e-government act calls for agencies to do that, but the extent to which they are doing it varies. that is work we can talk about if there are oversight opportunities, to look at the extent to which they are using privacy impact assessments, especially in the realm of biometrics. >> what do you think some of the potential adverse consequences might be of agencies failing to track information, either themselves or through third-party systems? >> a couple of things come to mind. are they using systems that have reliable data? do they have quality images? that will affect the sorts of matching results that come back and the extent to which those can be trusted. you can see where there is the potential for a high mismatch error rate, which, in a law enforcement example, might mean you are running down a lead that might not be fruitful, or you might be missing an opportunity. that is one piece of it. the other piece, thinking about this from a privacy perspective, is: how are the images being collected, how are they being used, and does the individual have any say? did they provide consent to their data being captured? there are a number of different risks associated. and there is the issue of data security: are these systems secure? we have had cybersecurity on the high-risk list for many years within the federal government. you can imagine this opens the door for potentially greater security breaches. >> sitting on the cyber subcommittee, i think you are exactly right.
we talk about this from a data privacy perspective, but we also need to recognize there is a huge potential for cybersecurity challenges when you're collecting these types of biometrics and storing them through third parties, which in some cases can be more of an issue, but certainly also if agencies are storing that information themselves. my time is almost expired. i yield back. >> i believe we will have time for a second round of questions. i now recognize myself for five minutes. it is abundantly clear that the u.s. taxpayer has suffered greatly from identity fraud: irs refund fraud, unemployment benefit fraud during covid, you name it. has anyone, to your knowledge, inside gao or elsewhere, netted out the total loss to the federal government from identity fraud that might be prevented by using state-of-the-art identity-proofing mechanisms? >> that is certainly not something that came up in the course of the recent work we have done. i am not aware, but happy to take that back and follow up with you. >> i think we will be asking you what the scope of such a survey would be. there appear to be little bits and pieces of documentation of the enormous losses that the taxpayer suffers from this. trying to get that balance right, i think, could be an important outcome. >> i am happy to do that. >> one of the tough things we will face as a government is sharing data with other governments. biometric databases, or regulating crypto, where you will need to have a uniquely identified crypto driver's license, if you will, very much like setting up a passport system; something where you have to identify whether someone is operating multiple identities in multiple jurisdictions. dr. ross, are you familiar with the state of the art and what might be useful there? are there investments we can make toward more research that would allow you to ask very sensitive questions of big databases owned by other states or governments? dr. ross: certainly, and i think one concept that can be harnessed, but has to be further researched, is the notion of differential privacy. it would indicate that within a certain jurisdiction you are able to do certain identity assessments using biometrics, and you have specific use cases, specific purposes, in which identities can be matched, but in other cases the identity cannot be matched. by defining the policies, one could then use the principles we alluded to earlier, including homomorphic encryption and differential privacy, in order to ensure that that kind of functionality can be performed. however, i must note that this research is still in its infancy in the context of biometrics, and certainly more investment is needed in order to assess its suitability for operational environments. further collaboration and investment is definitely needed to implement these techniques in operational environments.
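the differential-privacy notion dr. ross raises, together with the "privacy budget" the chairman mentioned earlier, can be sketched as an interface that answers only aggregate queries with calibrated noise and refuses once a finite epsilon budget is spent. the budget, epsilon values, and query below are illustrative assumptions, not recommended parameters.

```python
import math
import random

class PrivateCounter:
    def __init__(self, total_budget):
        self.remaining = total_budget            # total epsilon available

    def laplace(self, scale):
        u = random.random() - 0.5                # inverse-cdf laplace sampling
        return -scale * math.copysign(1, u) * math.log(1 - 2 * abs(u))

    def noisy_count(self, true_count, epsilon):
        if epsilon > self.remaining:
            raise RuntimeError("privacy budget exhausted; query refused")
        self.remaining -= epsilon                # budgets compose additively
        # a counting query has sensitivity 1, so the laplace scale is 1/epsilon
        return true_count + self.laplace(1.0 / epsilon)

db = PrivateCounter(total_budget=0.5)
print(db.noisy_count(true_count=42, epsilon=0.25))  # roughly 42, give or take
print(db.noisy_count(true_count=42, epsilon=0.25))  # budget now exhausted
# a third query at epsilon=0.25 would raise: the interface shuts off instead
# of letting repeated answers average the noise away
```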
>> when you are involved in international standards settings, which is part of nist's mission, do you get the feeling that the united states is leading the way, or are there peers around the world that are as sophisticated technologically in biometrics and in privacy-preserving methods? dr. romine: in the work we are doing in the international standards arena surrounding identity management, we certainly believe we are leading in that space. there are certainly other like-minded countries that are partners with us and that value democratic ideals. we strive to work closely with them, and they do have very strong technical capabilities in these areas as well. >> i have been struck that in some european nations, you have a right to know when any government accesses your data, at least outside of a criminal investigation. are these things that can be guaranteed, or is that an unsolvable problem, if you understand my question? i dream of some technology that would allow you, with cryptographic certainty, to know that someone has touched your data. dr. romine: it is certainly theoretically possible to use cryptography to address the concern. i wouldn't call it foolproof necessarily. the history of advancing technologies is colored with many different advances and risks, and the risks are addressed by new technologies, which create additional risks. the goal for us is to ensure the trustworthiness of the underlying systems, and cryptography can be important there. >> dr. ross, did you have any thoughts on the feasibility of that as a long-term goal? dr. ross: yes, and it's an excellent question. one thing it entails is keeping a ledger of interactions between humans and the data being stored. for example, the blockchain principle has been used to keep track of certain transactions that have occurred, and these are immutable. i believe some of these principles can be leveraged in the field of biometrics, but i must maintain that more research is needed. more investment is needed. certainly, the technology is available; it then has to be incorporated into the context of biometrics. >> my time has expired.
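a back-of-the-envelope sketch of the tamper-evident access ledger dr. ross alludes to: each touch of a biometric record is appended to a hash chain, so after-the-fact edits are detectable. the record ids, accessors, and purposes here are hypothetical, and a real deployment would add signatures, replication, and access control.

```python
import hashlib
import json
import time

ledger = []  # each entry commits to the previous one, blockchain-style

def log_access(record_id, accessor, purpose):
    prev_hash = ledger[-1]["hash"] if ledger else "0" * 64
    body = {"record": record_id, "by": accessor, "purpose": purpose,
            "time": time.time(), "prev": prev_hash}
    body["hash"] = hashlib.sha256(
        json.dumps(body, sort_keys=True).encode()).hexdigest()
    ledger.append(body)

def verify():
    prev = "0" * 64
    for entry in ledger:
        body = {k: v for k, v in entry.items() if k != "hash"}
        recomputed = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if entry["prev"] != prev or entry["hash"] != recomputed:
            return False
        prev = entry["hash"]
    return True

log_access("subject-17", "agency-a", "border entry verification")
log_access("subject-17", "agency-b", "criminal lead generation")
print(verify())                   # True: chain intact
ledger[0]["purpose"] = "edited"   # any tampering breaks the chained hashes
print(verify())                   # False
```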
>> dr. romine, we were having the discussion about the continuum of privacy and how that works ethically with our efforts to regulate it. in your written testimony, you talked about the idea that privacy can be violated when the scope of how biometric data is used differs from the expectation of the person who provided it. that is ethically complex also, because sometimes there are societally beneficial uses. one we have been talking about is using clearview ai to halt sex trafficking. the people who are saved from sex trafficking didn't give permission for the use of their data in that context, but if you ask them if it's ok, they say yes. how do you navigate that minefield? dr. romine: that's a tricky question. when an organization has acquired biometric data for whatever purpose, these are now assets in their control. sometimes the pressure to use those assets in ways that weren't originally intended is pretty enormous: the idea that we could do this, instead of thinking, should we do this with those data. that is one of the reasons why we always have to stress the importance of context of use. in some cases, a new context of use may be enormously beneficial and perhaps not even controversial. in other cases, it could be extremely, potentially damaging. this is the difference between cybersecurity and privacy, in the sense that a breach does not have to take place for privacy harms to occur. using biometric data in ways that were not intended, and that perhaps violate the expectations of those who provided those data, can create those privacy events. >> i completely agree. i want to ask a question about that to dr. ross. you were talking about privacy violations that can occur when using facial recognition to infer racial, sexual, or health characteristics that were not provided by the person. how do you navigate that in an ethical sense? when i post a picture of myself on facebook and one of my friends looks at it and says he looks unwell, i can't then point my finger and say, that's a privacy violation, i didn't intend for you to infer anything about my health; they would just roll their eyes. it's understood my picture is out there and those inferences can be made by anybody who sees it. why do we make a distinction between that when a human does it and when a machine does it? dr. ross: we are really distinguishing between human-based analytics and machine-based analytics. a machine could have billions of images; you can run the software over these billions of images and then make some assessments in the aggregate without user consent. it is the ability to do this repeatedly over massive amounts of data, and then use that aggregate to perform additional activities that were not indicated to the user, that is where the problem lies. if the user were to give consent, saying these images can be used for further analytics, i believe using the machine would be productive in some cases, but in other cases, as you point out, there might be a violation of privacy. i think it comes down to user consent and the fact that you can do this en masse: how do we do it in a manner where the person is aware of how their data is being used, and where we do not unwittingly glean additional pieces of information that might violate their privacy? >> i somewhat agree, though i would argue the distinction is not the amount of data that is processed. another question before i run out of time. you talk in your testimony about privacy by design, which i think is an elegant concept, but consider me a skeptic, because if you are using an algorithm that distorts images in a way that sex or ethnicity cannot be read, we will run into the same problem that we did with cryptography, where cryptographic algorithms developed 10 years ago don't work anymore because the computers are more powerful. as this technology gets better, aren't those algorithms not going to work anymore either? dr. ross: an excellent point, very insightful. this is where more mathematics is needed as we start developing biometric technology and applying it: understanding what the privacy leakages are, the information leakage. what is lacking is privacy metrics, and privacy is a moving target. if technology cannot deduce some attribute from a face image today, it might be able to do it tomorrow. what was deemed to be private today may no longer be deemed to be private tomorrow. that is where the concern is. as the technology evolves, these protection techniques must be revisited. they are not static in time; they are dynamic in time. as technology advances, these policies must evolve, and the metrics being used to evaluate them must also evolve. in short, i completely agree with your statement: some of the problems in cryptography can manifest themselves in other techniques. but it is not unsolvable. with adequate technology development, especially employing mathematical transformations, i believe that a solution can be found. >> thank you for that. i yield back. >> i think there may be time for an additional round.
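dr. ross's "moving target" point suggests a continuous re-evaluation loop: whenever a stronger attribute classifier appears, re-measure how much a de-identification transform still leaks and retire it if it falls behind. the sketch below uses toy stand-ins for the faces, the sanitizer, and the attackers; it illustrates only the auditing pattern, not any real de-identification method.

```python
import random

def leakage(sanitize, attacker, labeled):
    """fraction of hidden attributes the attacker recovers after sanitization."""
    hits = sum(attacker(sanitize(x)) == y for x, y in labeled)
    return hits / len(labeled)

def audit(sanitize, attackers, labeled, chance=0.5, margin=0.05):
    """pass only if no known attacker beats chance by more than `margin`."""
    return all(leakage(sanitize, a, labeled) <= chance + margin for a in attackers)

# toy stand-ins: a "face" is one number whose sign encodes a protected attribute
random.seed(1)
faces = [(x, x > 0) for x in (random.uniform(-1, 1) for _ in range(2000))]
sanitize = lambda x: x + random.uniform(-2, 2)   # v1 de-identification: add noise

attacker_2022 = lambda x: random.random() < 0.5  # last year's best: guessing
print(audit(sanitize, [attacker_2022], faces))   # True: looked private then

attacker_2023 = lambda x: x > 0                  # stronger model reads residue
print(audit(sanitize, [attacker_2022, attacker_2023], faces))  # False: redesign
```

the point of the pattern is exactly his: a transform certified against last year's attackers can fail against this year's, so the audit must be re-run as the attacker pool grows.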
>> i am just catching up to all of you, and i will never be able to catch up to jay or bill on this subject, but stephanie i can at least keep up with. i want to thank the panel. there was a word that you used, dr. ross: immutable. then you got into the conversation with mr. obernolte about the fact that technology may make some of what we are trying to do today in terms of privacy and cybersecurity outdated tomorrow. it reminded me of a great oklahoman, will rogers: the only thing that is immutable is death and taxes. i guess my question is, and i'm really just a science-fiction person when it comes to this, thinking of minority report with tom cruise, you may have all kind of addressed that already: every place he goes they know him already, and eventually he has to have his eye taken out because of this. i went and bought an ipad holder from a company called weathertech the other day. we were in there for something else, and i thought it looked good and bought the thing. all of a sudden i am getting ipad holder ads like crazy, and i didn't even look online; i just bought the darn thing. i just feel like i've got either big business looking over my shoulder or big government looking over my shoulder. i am making more of a statement than asking a question, but i guess i will start with you, ms. wright. dr. romine was talking about privacy versus cybersecurity. what can we do in congress to ensure ourselves a little more privacy? >> i think a key important factor is how we hold agencies accountable for the information they are collecting. the purpose for which the information is being used, and how it is being stored, shared, and destroyed, are fundamental things to start with when we think about privacy. and we should really think about what applications or use cases we think should be permitted or restricted, because then you will start to get a handle on where the concerns are with respect to privacy. at the end of the day, it is about trade-offs. while there might be some convenience factors and some security benefits as well, there is also the issue of privacy and being able to protect your personal information, and that is where the tension lies. rep. perlmutter: there is also tension between the kind of privacy we may want from state or local governments versus the kind of privacy we may want from private enterprise. the thing that i ran into, it was a spontaneous purchase of this ipad holder, and all of a sudden i am getting ads about it. so you have two really sizable entities out there looking over your shoulder. i think we in congress need to think about both of those when we are thinking particularly about privacy and cybersecurity. gentlemen, does anybody have a comment on my general proposition here? it is not science-based, but it is personal-based. dr. ross: i would be happy to share some comments. i think the issue that you are describing is actually very important: namely, the exchange of biometric data. biometric data collected for one purpose can then be transmitted to another agency, to another entity, which might use it for a different purpose. one way to prevent this, even before it happens, is by ensuring that when we store the biometric data in one entity, it is suitably encrypted, and when it is used in a different entity, it is encrypted differently or transformed differently. what happens then is that the two sets of data cannot be linked, because they have been transformed differently. i think that becomes important. on the flip side, it might prevent one agency from communicating with another agency, because the biometric data cannot be linked. this is where use-case-specific policies must be instituted.
there are certain situations where it is acceptable and other situations, like the one you described, that are not acceptable. this is where technology development must be augmented with legislative instruments to manage the data in a manner that is appropriate to different use cases. rep. perlmutter: thank you, my time has expired. rep. foster: we will recognize representative bice for five minutes. rep. bice: part of that, i recognize the connections: it is remarketing. your email is likely tied to your credit card in some way, or you may have entered your email address when you checked out, and your email is tied to social media. when they realized you purchased that, they started marketing all kinds of things to you. that has been going on for some time, but for a lot of folks it is concerning: they wonder, how did they know? how did they get this information? that is big data at its finest. in oklahoma, this last session we passed the computer data privacy act. the bill allows for the option for personal rights to be returned to the individual, along with the option for cancellation of the information in a private company's database. to me this seems like it could be a solution for privately collected biometric data. but my question, to any of the witnesses here, is: what do you think are the most concerning aspects of developing biometric technology? dr. ross: i would be happy to offer some comments, if you don't mind, on both parts of your excellent question. one of the most obvious concerns about biometrics is the ability to link different data sets. that clearly constitutes a problem in some cases; in other cases it is an advantage. once again, as technology improves, as the recognition accuracy numbers improve, this kind of linking can be done with more certainty because the errors are decreasing. this is where policy for regulating the use of the technology becomes important: in some use cases it is essential to have the functionality, in other cases it may not be required. secondly, in response to your first comment, again an excellent comment: when a user in a private enterprise offers their image, it would be nice if they could say for what purposes it can be used. if it is a face image, they could say, you can use this for biometric recognition, but it should not be used for assessing age or health cues. the moment they specify that, the data should be transformed, in a manner that would facilitate only the permitted functionality, prior to storing it in a database. this gives some degree of control to the user, because the user is now able to specify what kind of information can be gleaned and what kind of information should not be gleaned. i think that is one important area in which more investment is needed. many techniques have been proposed, but these have not been fully evaluated. there is tremendous opportunity if we were to invest on this front. excellent questions, and thank you for hearing me.
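a sketch of the consent-scoped storage dr. ross describes: at enrollment, the raw image is reduced to templates for exactly the purposes the person agreed to, and the raw image itself is discarded, so analytics that were never consented to, such as age or health estimation, have nothing to run on. the extractor, purposes, and data here are hypothetical stand-ins, not a real biometric pipeline.

```python
import numpy as np

def identity_template(image):
    """stand-in for an identity-only embedding network."""
    v = image.flatten()[:64]
    return v / np.linalg.norm(v)

# only extractors for consentable purposes exist in the pipeline at all
EXTRACTORS = {"identity_verification": identity_template}

def enroll(image, consented):
    record = {}
    for purpose in consented:
        if purpose not in EXTRACTORS:
            raise ValueError(f"no consent pathway for: {purpose}")
        record[purpose] = EXTRACTORS[purpose](image)
    return record                       # note: the raw image is not retained

face = np.random.default_rng(7).standard_normal((16, 16))
stored = enroll(face, {"identity_verification"})
print(list(stored))                     # only the consented template exists

try:
    enroll(face, {"age_estimation"})    # use the person never agreed to
except ValueError as err:
    print(err)                          # refused at enrollment time
```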
rep. bice: anyone else care to comment on that particular aspect? dr. romine: i would be happy to weigh in. some challenges involve the ability, as dr. ross said, to glean certain types of information, and some of the potential societal harms or inequities that may result. going back to the ranking member's question about his facebook page and his friend seeing his image and saying he doesn't look very well: imagine if it was an insurance company saying you don't look very well and taking steps as a result of that assessment. those are the societal harms that we need to be wary of. rep. bice: i think this is a really great point: the use of biometrics is incredibly important, and we need to be able to develop systems and controls that allow individuals to have some sort of say in how their information is utilized. thank you to our witnesses for your time today, and chairman, i yield back. rep. foster: we will embark on a final round of questions. i will recognize myself for five minutes. dr. ross, you seem to be coming close to describing something that resembles a licensing regime for collecting biometric data. say someone wanted to put a facial recognition camera in front of their nightclub to find people who have repeatedly shown up at the nightclub and caused violence. it sounds like a legitimate thing, but if they start transferring that information around, there are a bunch of issues that come up. are there standards, and this might also be a question for dr. romine, are there standards for how you would license the collecting of the data and license the transferring of the data, so that ultimately, if you are holding biometric data on someone, you would have to be able to demonstrate a chain of custody showing that you obtained it only through a set of licensed distributors of data, with customer consent at each point? have people gone that far? has any country gone in that direction? dr. ross: thank you for your question, chairman foster. i will address the first question, and perhaps my distinguished colleague will answer the second part. on the first part, there is research being conducted in which privacy is being moved closer to the sensor than the data. once the data is acquired, it is available: it can be encrypted, it can be transformed, but someone has access to the data. what if we move the privacy aspect to the camera itself, in such a way that the camera is designed in a manner that it can only extract or acquire specific aspects of the scene? that becomes very important, because nowhere would the digital version of the full scene be available. transforming images even prior to storing them, at the sensor level, might be one way in which the scenario that you described can be handled, because the data will no longer lend itself to being processed by a different organization or entity: the data was processed at the time it was acquired by the camera. that could be one technological solution, but as i mentioned earlier, these things have to be evaluated. so, much more research, investment, and evaluation are needed to substantiate these principles. rep. foster: will this ultimately require, for some purposes, basically a government backdoor? for example, if you have cameras looking at elevators to make sure that you are opening and closing the elevators as fast as possible, where you only really have to detect the presence of a human, and all of a sudden you find a massive crime has been committed, the government might want to go through a trusted court system and say: bypass the obfuscation, i want to see the person's face. dr. ross: the same data can be stored in different formats, with different transformations, so that it can be used for some purposes and not for other purposes. i think that technology can be applied in order to transform the data into different formats, but the individual formats should be guided by policy as to who should and who should not access them.
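a sketch of the "move privacy to the sensor" idea dr. ross describes, in the spirit of the chairman's elevator example: the camera reduces each frame to a person-count on-device, and only that count ever leaves it. the blob-counting detector is a toy stand-in for a real on-device model, and the frame data is synthetic.

```python
import numpy as np

def count_people(frame):
    """toy stand-in for an on-device person detector."""
    return int((frame > 0.99).sum() // 40)       # bright blobs approximate people

class PrivacyPreservingCamera:
    """emits aggregate measurements; raw frames never cross the device edge."""
    def emit(self, frame):
        count = count_people(frame)
        del frame                                 # the pixels are discarded here;
        return count                              # only the count is returned

cam = PrivacyPreservingCamera()
frame = np.random.default_rng(3).random((240, 320))
print("people waiting:", cam.emit(frame))         # downstream sees only a number
```

as ross notes, this is a design-time commitment: because the full scene is never digitized off the sensor, there is nothing for a later "backdoor" to recover unless a second, policy-gated format is deliberately retained.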
rep. foster: do you have any comments from when you engage with some of your foreign colleagues on this? do they face a very different set of attitudes than in the united states? dr. romine: certainly that is true. for example, as you know, the gdpr in europe envisions a different way of approaching protections for privacy than we currently have in the united states. that said, one of the reasons that the privacy framework we have developed is regulation-agnostic and even technology-agnostic is that we want it to be adaptable and usable around the globe, and able to provide assurance that if you follow these guidelines, you have evidence to support that you are complying with whatever regulatory regime you happen to be in at any given time. rep. foster: thank you. i will recognize representative obernolte for five minutes. rep. obernolte: a couple of interesting things have come up, like: how do we safeguard privacy from a 30,000-foot-view level? some things could work, and some things probably won't work. dr. ross, you were mentioning disclosure. i used to think that that was a great idea, and then i started looking at end-user license agreements for software. there are pages and pages that people scroll through and click agree at the end. what good would it do to add another paragraph that says, here's how we will use the facial data that you give us? there was an episode of south park a few years ago, a parody in which one of the characters had inadvertently given apple the right to do experimentation on him. his friends were like, you clicked on that and signed it without reading it? who does that? the answer is, everyone does that. i think maybe control over who has access to the data, if i give my data to apple for use for a certain purpose, the principle that apple should not give that data to someone else to use for a very different purpose, i think, is closer to the mark. we won't find a real regulatory solution to the problem without looking at the things we are trying to prevent, what attorneys call the parade of horribles. someone asked about that. we are entering into this era when anonymity is a lot less than it used to be, and that will be true regardless of what approach we as a government take to privacy. can you walk us through the worst things that could happen if we fail at this? those are the ones we have to try to prevent. dr. romine: fair enough. i will say figuring out what the worst things are might take some time, but some of the things that i've alluded to include the idea of organizations making decisions based on inferences from biometric data that disadvantage certain groups over others. rep. obernolte: let me stop you. we have had that problem. there are ethics around ai algorithms that we are dealing with. i think the solution is to focus on the fact that that behavior is already illegal. so, if i'm going to kill someone, it is equally illegal for me to kill them with a knife or a gun. the tool doesn't matter; the act matters. why is that different in the case of privacy? dr. romine: i don't think it is so much different; rather, it is a consequence of the lack of privacy or a privacy compromise. privacy in this case, or the compromise of privacy, a privacy event, would lead to that activity. there are other things that i can imagine. there are aggregates, societal decisions that are made, that might be predicated on aggregate data that violates privacy considerations.
policies may be instituted that harm certain populations as a result of certain issues related to privacy or biometrics. in all of these cases, what we have discerned is that there is no purely technological solution to the privacy problem, and no purely policy-based solution to the privacy problem. it is providing and improving privacy-protecting technologies, and matching those with appropriate policy, that can prevent some of these tragedies. rep. obernolte: i agree with you, but i definitely think in crafting policies we need to ask ourselves the question: what problem are we trying to solve, and what are we trying to avoid? merely focusing on anonymity, i think, is a fool's errand. we have less anonymity, and there's not anything we can do about that. as for the parade of horribles, the government has powers that other entities don't. if you want the parade of horribles, look at what china does with the personal data that they have for people. that is the top of my list of the parade of horribles, but i don't think that we will get there from a policy framework standpoint without thinking about the problem we are trying to solve. it is a discussion i'm sure we will continue to have over the next few years. i yield back. rep. foster: we will now recognize our lawyer in residence for five minutes. >> i think mr. obernolte is focusing on the question of the day. i remember serving in the state senate 20-plus years ago. we were just trying to have an internet within the colorado legislature, and something came up and we were talking about social security numbers, should we release them, all that stuff, for privacy purposes. i was being cavalier, and i said, "there is no such thing as privacy." to your point, there is no such thing as anonymity, and that has only grown in the last 30 years. the question is, from a policy perspective: technologically we can address things, and as ms. wright says, you give up some things to get some things. you can make it tougher for a cybercriminal or someone to use your data, but you are giving up some efficiency or ease of use in the process. the supreme court made several decisions, none of which i like. the one that i like the least is the reversal of roe v. wade, but they basically say that under the united states constitution there is no such thing as a right to privacy. and i don't know. i want to feel secure that when i go buy something spontaneously, that doesn't alert everyone under the sun to something. or when i walk by a grocery store or gas station, all of a sudden that doesn't send out a message to the neighborhood: let's send him x, let's get him. this is for everyone, including my two colleagues. to jay's question, what are we trying to solve? what do we want? do we want to create a right to privacy where the supreme court says there isn't such a thing? we can legislatively say something like that. how far do we want to take it? then, for the technologists: help us put that into play, knowing that technology will evolve and change, and things that we thought were in place will be replaced. it is ed perlmutter thinking based on jay obernolte's line of questioning. ms. wright, as the director at the agency that thinks about this stuff: from a technology standpoint, we can do some things if you give us clear direction. i think that bill is trying to do that with some of his digital identity legislation, and jay has some stuff too. dr. foster, i will turn it back to you, and you can do with my two minutes whatever you wish.
rep. foster: that is an interesting thought. you know, i will ask a question. so much of this will have to do with our cell phones. dr. romine, is there good coordination and communication with the manufacturers of the cell phones? there is incredible ai built into the next generation of smartphones, but not all of it is in the secure enclave, where you have some idea it is trusted computing. are you having thoughtful interactions, or do you get the sense that they are just trying to set up a walled garden and keep everyone's privacy information under their control? dr. romine: we work with a very large cross-section of the technology sector, including cell phone manufacturers and providers. on further reflection on ranking member obernolte's question about significant harms: one of the significant harms i can imagine is through cell phone tracking or face recognition, with street cameras and so on. someone trying to access safe and reliable medical services, whether psychiatric or something else, suddenly finds that this becomes public record. someone has now been outed by biometric information because it tracked them trying to obtain services. this is another very serious potential issue. but yes, we are in discussion with cell phone manufacturers and other advanced technology firms all the time. rep. foster: thank you again. we could go all afternoon on this, but i suppose i have to close the hearing now. before we bring the hearing to a close, i want to thank our witnesses for testifying before the committee. it is really valuable for us in congress, as we struggle with all of the policy issues around biometrics and privacy, that we have access to real, quality experts, so we can understand the technological reality and the feasibility of things, and don't generate legislation based on wishful thinking rather than technical reality. the record will remain open for two weeks for additional statements from the members or additional questions that the committee may ask of the witnesses. the witnesses are now excused, and the hearing is adjourned.