Human Generated Data

Title

Untitled (group of African American women seated and listening to cooking demonstration)

Date

c. 1945

People

Artist: Harry Annas, American, 1897 - 1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.3495

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 99.7
Human 99.7
Person 99.6
Person 99.2
Person 96
Person 95.3
Person 95.1
Person 91.3
Restaurant 88.3
Person 88
Cafeteria 73.6
Bowl 72.2
Person 72.2
Meal 71.5
Food 71.5
Person 71.4
Apparel 66.2
Clothing 66.2
Indoors 61.5
Text 61.2
Electronics 60.4
Screen 60.4
Room 59.5
Sitting 58.7
Monitor 55.2
Display 55.2
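
The Amazon tags above are flat name/confidence pairs, the shape Rekognition's label detection reports. A minimal sketch of filtering and ranking such labels by confidence; the `labels` list is transcribed by hand from a few of the values above, not fetched from the API, and the threshold of 70 is an arbitrary illustration.

```python
# Rekognition-style labels: each entry pairs a name with a confidence
# percentage. Values are transcribed from the tag list above.
labels = [
    {"Name": "Person", "Confidence": 99.7},
    {"Name": "Restaurant", "Confidence": 88.3},
    {"Name": "Cafeteria", "Confidence": 73.6},
    {"Name": "Bowl", "Confidence": 72.2},
    {"Name": "Meal", "Confidence": 71.5},
    {"Name": "Monitor", "Confidence": 55.2},
]

def confident_labels(labels, threshold=70.0):
    """Keep labels at or above the threshold, highest confidence first."""
    kept = [l for l in labels if l["Confidence"] >= threshold]
    return sorted(kept, key=lambda l: l["Confidence"], reverse=True)

print([l["Name"] for l in confident_labels(labels)])
```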

Imagga
created on 2022-01-22

monitor 44.4
x-ray film 34.4
computer 30.5
business 29.1
office 29.1
film 28.1
man 27.5
people 26.8
equipment 25.6
businessman 22.1
laptop 22
photographic paper 21
working 20.3
television 20.1
male 19.8
person 18.9
electronic equipment 18.8
desk 17.9
meeting 16.9
communication 16.8
professional 16.6
work 16.5
room 15.4
group 15.3
corporate 14.6
sitting 14.6
men 14.6
adult 14.5
screen 14.5
job 14.1
photographic equipment 14
indoor 13.7
interior 13.3
education 13
hand 12.9
executive 12.9
table 12.8
technology 12.6
barbershop 12.3
phone 12
shop 11.9
display 11.5
indoors 11.4
occupation 11
modern 10.5
window 10.4
keyboard 10.3
finance 10.1
telecommunication system 10.1
silhouette 9.9
worker 9.8
chair 9.6
happy 9.4
manager 9.3
teamwork 9.3
businesswoman 9.1
black 9
team 9
handsome 8.9
information 8.8
design 8.4
board 8.1
success 8
classroom 8
light 8
looking 8
teacher 7.9
mercantile establishment 7.9
women 7.9
smile 7.8
notebook 7.8
consultant 7.8
gesture 7.6
businesspeople 7.6
horizontal 7.5
blackboard 7.5
showing 7.5
one 7.5
back 7.3
digital 7.3
smiling 7.2
to 7.1

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

text 99.7
window 91.1
person 90.9
clothing 87.4
screenshot 81.6
woman 64.7
picture frame 14.6

Face analysis

Amazon

Google

AWS Rekognition

Age 2-10
Gender Male, 55.6%
Sad 79.2%
Calm 10.9%
Surprised 3%
Happy 2.3%
Fear 2%
Angry 1.3%
Confused 0.8%
Disgusted 0.5%

AWS Rekognition

Age 28-38
Gender Female, 97.8%
Calm 42.3%
Happy 25.6%
Sad 11.8%
Fear 10.5%
Confused 6.1%
Surprised 1.4%
Disgusted 1.3%
Angry 1%

AWS Rekognition

Age 23-33
Gender Female, 95.2%
Calm 35.5%
Surprised 25.7%
Sad 20%
Confused 7.3%
Fear 5.2%
Disgusted 4.2%
Angry 1.3%
Happy 0.7%

AWS Rekognition

Age 23-33
Gender Female, 88.1%
Calm 92.5%
Happy 2.1%
Sad 1.2%
Angry 1.1%
Disgusted 1%
Confused 1%
Surprised 0.6%
Fear 0.5%

AWS Rekognition

Age 12-20
Gender Male, 61.6%
Calm 84.3%
Confused 7.8%
Sad 2.9%
Happy 2.7%
Fear 1.3%
Disgusted 0.4%
Surprised 0.3%
Angry 0.3%

AWS Rekognition

Age 29-39
Gender Male, 96.1%
Confused 34.2%
Calm 29.3%
Sad 25.8%
Happy 4.3%
Fear 2%
Surprised 2%
Disgusted 1.2%
Angry 1.1%
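
Each AWS Rekognition block above reports a per-face emotion distribution that sums to roughly 100%. A minimal sketch of reducing such a distribution to its dominant emotion; the `emotions` dict is entered by hand from the first face block above (Age 2-10), not taken from an API response.

```python
# Emotion scores (percent) for one detected face, transcribed from the
# first AWS Rekognition block above.
emotions = {
    "Sad": 79.2, "Calm": 10.9, "Surprised": 3.0, "Happy": 2.3,
    "Fear": 2.0, "Angry": 1.3, "Confused": 0.8, "Disgusted": 0.5,
}

def dominant_emotion(emotions):
    """Return the (name, score) pair with the highest score."""
    return max(emotions.items(), key=lambda kv: kv[1])

name, score = dominant_emotion(emotions)
print(f"{name} {score}%")
```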

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely
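
Unlike Rekognition, Google Vision reports face attributes as likelihood buckets rather than numeric scores. A minimal sketch mapping those bucket names to an ordinal scale so they can be compared; the bucket order follows Vision's Likelihood enum, with names spelled as they appear in the listings above.

```python
# Google Vision likelihood buckets in increasing order of likelihood.
LIKELIHOOD = ["Very unlikely", "Unlikely", "Possible", "Likely", "Very likely"]
RANK = {name: i for i, name in enumerate(LIKELIHOOD)}

def at_least(value, floor):
    """True if `value` is at or above the `floor` bucket."""
    return RANK[value] >= RANK[floor]

# The fourth face block above reports Blurred as "Very likely".
print(at_least("Very likely", "Likely"))
```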

Feature analysis

Amazon

Person 99.7%

Captions

Microsoft

a group of people standing in front of a window 71.1%
a group of people in front of a window 68.4%
a group of people standing next to a window 68.3%
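
Microsoft's captioner returns several candidate captions, each with a confidence. A minimal sketch of selecting the best candidate; the `captions` list is transcribed from the three candidates above.

```python
# Candidate captions with confidences, transcribed from the list above.
captions = [
    ("a group of people standing in front of a window", 71.1),
    ("a group of people in front of a window", 68.4),
    ("a group of people standing next to a window", 68.3),
]

# Pick the candidate with the highest confidence.
best_text, best_conf = max(captions, key=lambda c: c[1])
print(best_text)
```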

Text analysis

Amazon

B