Human Generated Data

Title

Untitled (women's group, Quota Club)

Date

1939

People

Artist: Harris & Ewing, American 1910s-1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.22279

Machine Generated Data

Tags

Amazon
created on 2022-03-11

Person 99.4
Human 99.4
Person 99.1
Person 98.5
Person 98.4
Person 98.4
Person 97.7
Person 97.6
Person 96.7
Clinic 91.7
Person 90.7
Room 87.3
Indoors 87.3
Person 80.3
Table 75.6
Furniture 75.6
Symbol 69.4
Interior Design 67.8
Court 67.7
Hospital 65.6
People 64.6
Flag 59.2
Crowd 59.1
Nurse 58.5
Workshop 58.4
Desk 57.7
Operating Theatre 57.7
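
The label/score pairs above match the shape of output from AWS Rekognition's DetectLabels API. A minimal sketch with boto3, assuming configured AWS credentials; the file name and thresholds are illustrative, not taken from this record.

import boto3

IMAGE_PATH = "quota_club_1939.jpg"  # hypothetical local copy of the photograph

client = boto3.client("rekognition")

with open(IMAGE_PATH, "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,        # cap on the number of labels returned
        MinConfidence=50.0,  # drop low-confidence guesses
    )

# Print "Label score" pairs in the same form as the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")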

Clarifai
created on 2023-10-22

people 99.6
group 99.3
administration 97.9
man 97.5
leader 97.4
adult 97.3
furniture 96.1
many 95.7
group together 95.4
war 92.9
room 92.4
sit 92.1
woman 91
several 89.3
military 88.1
meeting 87.4
indoors 87.2
desk 87.1
chair 84.2
sitting 82.2
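
Tags like these could come from Clarifai's v2 predict endpoint. A REST sketch using requests; the API key, model ID, and image URL are placeholders, and concept values are scaled from 0-1 to percent to match the list above.

import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"       # placeholder credential
MODEL_ID = "general-image-recognition"  # Clarifai's public general model
IMAGE_URL = "https://example.org/quota_club_1939.jpg"  # hypothetical

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    timeout=30,
)
resp.raise_for_status()

# Each concept carries a 0-1 value; scale to percent.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")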

Imagga
created on 2022-03-11

barbershop 59.5
shop 51.4
mercantile establishment 38.6
nurse 36.3
person 35.8
people 35.7
male 31.9
man 31.6
professional 28.3
salon 26.6
adult 26.3
place of business 25.8
room 25.1
office 24.4
work 23.5
sitting 22.3
happy 21.3
patient 21
table 20.8
worker 20.7
business 20
medical 19.4
businessman 19.4
home 19.1
desk 18.9
computer 18.4
smiling 18.1
men 17.2
working 15.9
indoors 15.8
women 15.8
indoor 15.5
portrait 14.9
team 14.3
smile 14.2
businesspeople 14.2
meeting 14.1
doctor 14.1
mature 13.9
coat 13.7
job 13.3
teacher 13.3
interior 13.3
clinic 13.1
group 12.9
establishment 12.9
businesswoman 12.7
medicine 12.3
couple 12.2
teamwork 12
hospital 12
two 11.9
health 11.8
happiness 11.8
assistant 11.7
laboratory 11.6
together 11.4
senior 11.2
modern 11.2
chair 11.1
casual 11
laptop 10.9
lab 10.7
colleagues 10.7
cheerful 10.6
research 10.5
talking 10.5
lifestyle 10.1
scientist 9.8
test 9.6
education 9.5
biology 9.5
mother 9.4
executive 9.4
instrument 9.3
occupation 9.2
life 9.1
student 9.1
classroom 9
technology 8.9
looking 8.8
conference 8.8
discussion 8.8
chemistry 8.7
chemical 8.7
30s 8.7
exam 8.6
corporate 8.6
sit 8.5
communication 8.4
attractive 8.4
study 8.4
20s 8.2
board 8.1
suit 8.1
case 8
family 8
associates 7.9
face 7.8
scientific 7.7
check 7.7
pretty 7.7
profession 7.7
human 7.5
holding 7.4
care 7.4
kitchen 7.2
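
Imagga exposes tagging through a REST endpoint authenticated with an API key/secret pair. A sketch with requests and placeholder credentials; Imagga's confidences are already on a 0-100 scale.

import requests

API_KEY = "YOUR_IMAGGA_KEY"        # placeholder
API_SECRET = "YOUR_IMAGGA_SECRET"  # placeholder
IMAGE_URL = "https://example.org/quota_club_1939.jpg"  # hypothetical

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),  # HTTP basic auth
    timeout=30,
)
resp.raise_for_status()

# Each entry pairs a 0-100 confidence with a localized tag name.
for entry in resp.json()["result"]["tags"]:
    print(f"{entry['tag']['en']} {entry['confidence']:.1f}")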

Microsoft
created on 2022-03-11

person 97.1
text 95.4
man 95
indoor 87.8
clothing 86.1
table 60.9
posing 39.3
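
Tags in this form can be requested from Azure's Computer Vision Analyze Image endpoint. A sketch against the v3.2 REST API; the resource endpoint, key, and image URL are placeholders.

import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_AZURE_KEY"                                          # placeholder
IMAGE_URL = "https://example.org/quota_club_1939.jpg"           # hypothetical

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
    timeout=30,
)
resp.raise_for_status()

# Confidence is 0-1; scale to percent to match the list above.
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")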

Face analysis

AWS Rekognition

Age 48-54
Gender Male, 99.5%
Calm 68%
Sad 12.8%
Happy 7.6%
Confused 4.9%
Surprised 2.5%
Angry 1.9%
Disgusted 1.4%
Fear 0.7%

AWS Rekognition

Age 45-53
Gender Male, 100%
Happy 97.9%
Calm 0.7%
Surprised 0.6%
Confused 0.3%
Sad 0.2%
Angry 0.2%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 48-56
Gender Male, 92.7%
Happy 92%
Sad 4.2%
Confused 1%
Surprised 1%
Calm 0.7%
Disgusted 0.4%
Fear 0.4%
Angry 0.3%

AWS Rekognition

Age 47-53
Gender Male, 99.5%
Surprised 70.6%
Sad 12%
Calm 9.8%
Fear 2.1%
Happy 1.9%
Confused 1.7%
Disgusted 1.2%
Angry 0.7%

AWS Rekognition

Age 23-31
Gender Female, 90.6%
Surprised 53.8%
Calm 22.8%
Happy 10.5%
Fear 8.3%
Sad 2.3%
Disgusted 1.3%
Confused 0.7%
Angry 0.3%

AWS Rekognition

Age 38-46
Gender Male, 86.7%
Calm 88.9%
Happy 3.6%
Confused 3.5%
Sad 1.3%
Surprised 1.2%
Disgusted 0.6%
Angry 0.5%
Fear 0.4%

AWS Rekognition

Age 51-59
Gender Female, 84.4%
Calm 91.5%
Confused 2.9%
Sad 2.8%
Happy 0.8%
Surprised 0.7%
Angry 0.7%
Disgusted 0.4%
Fear 0.2%

AWS Rekognition

Age 26-36
Gender Male, 85.1%
Surprised 86.7%
Happy 9%
Fear 2.3%
Calm 1%
Sad 0.3%
Angry 0.3%
Disgusted 0.3%
Confused 0.1%
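
Per-face blocks like the ones above (age range, gender, and a ranked emotion distribution) match the shape of Rekognition's DetectFaces output when all attributes are requested. A minimal boto3 sketch; the file name is hypothetical.

import boto3

client = boto3.client("rekognition")

with open("quota_club_1939.jpg", "rb") as f:  # hypothetical local file
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age, gender, and emotion estimates
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions are not guaranteed to be sorted; rank them as in the listing.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")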

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
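
Google Vision reports per-face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why the blocks above read "Very unlikely", "Unlikely", and "Possible". A sketch with the google-cloud-vision client; the file name is hypothetical.

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("quota_club_1939.jpg", "rb") as f:  # hypothetical local file
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is a Likelihood enum, e.g. VERY_UNLIKELY or POSSIBLE.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)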

Feature analysis

Amazon

Person
Person 99.4%
Person 99.1%
Person 98.5%
Person 98.4%
Person 98.4%
Person 97.7%
Person 97.6%
Person 96.7%
Person 90.7%
Person 80.3%
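
The per-person confidences above correspond to the Instances array that DetectLabels returns for object labels such as Person; each instance also carries a bounding box. A self-contained boto3 sketch with a hypothetical file name.

import boto3

client = boto3.client("rekognition")

with open("quota_club_1939.jpg", "rb") as f:  # hypothetical local file
    response = client.detect_labels(Image={"Bytes": f.read()})

# Object labels such as "Person" carry per-instance detections.
for label in response["Labels"]:
    if label["Name"] == "Person":
        for instance in label["Instances"]:
            box = instance["BoundingBox"]  # relative coordinates, 0-1
            print(f"Person {instance['Confidence']:.1f}% "
                  f"(left={box['Left']:.2f}, top={box['Top']:.2f})")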

Text analysis

Amazon

the
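
A single detected word like this is typical output from Rekognition's DetectText, which returns both LINE and WORD detections. A minimal boto3 sketch; the file name is hypothetical.

import boto3

client = boto3.client("rekognition")

with open("quota_club_1939.jpg", "rb") as f:  # hypothetical local file
    response = client.detect_text(Image={"Bytes": f.read()})

# WORD detections are fragments of LINE detections; a stray word such as
# "the" can surface here on its own.
for detection in response["TextDetections"]:
    if detection["Type"] == "WORD":
        print(detection["DetectedText"])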