Human Generated Data

Title

Untitled (two men holding large key)

Date

1950

People

Artist: Peter James Studio, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20165

Machine Generated Data

Tags (label with confidence score in %)

Amazon
created on 2022-03-05

Person 97.7
Human 97.7
Clothing 94.2
Apparel 94.2
Person 93.9
Electrical Device 78.3
Microphone 78.3
Chair 77.4
Furniture 77.4
Coat 75.4
Crowd 73.5
Person 70
Clinic 69.6
Suit 68.6
Overcoat 68.6
Person 61.9
Musical Instrument 59.9
Musician 59.9
Scientist 57.6
Lab Coat 57
Shirt 56.7
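
The label/score pairs above are the standard output of Amazon Rekognition's DetectLabels API, which returns each label name with a confidence score in percent. A minimal sketch of how such tags are typically retrieved with boto3 (the file name and confidence cutoff are placeholder assumptions, and configured AWS credentials are assumed):

    import boto3

    client = boto3.client("rekognition")
    with open("photograph.jpg", "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=55.0,  # assumed cutoff; the list above bottoms out near 56
        )

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")
        # Repeated rows such as the several Person entries above come from
        # per-instance detections, available under label["Instances"].
        for instance in label.get("Instances", []):
            print(f"{label['Name']} {instance['Confidence']:.1f}")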

Imagga
created on 2022-03-05

man 38.3
person 36
male 32.7
life 31.3
business 28.5
people 27.3
musical instrument 25.8
businessman 25.6
wind instrument 23.5
brass 23.1
adult 23.1
job 22.1
happy 21.9
office 21.7
professional 21.1
smiling 19.5
work 18
corporate 18
men 18
handsome 16
indoors 15.8
couple 15.7
smile 15.7
executive 15.1
businesswoman 14.5
suit 14.4
holding 14
cornet 14
teacher 13.2
portrait 12.9
group 12.9
sitting 12.9
success 12.9
room 12.8
attractive 12.6
lifestyle 12.3
computer 12
looking 12
confident 11.8
worker 11.8
communication 11.8
working 11.5
businesspeople 11.4
meeting 11.3
home 11.2
laptop 10.9
team 10.7
interior 10.6
pretty 10.5
occupation 10.1
successful 10.1
device 9.8
medical 9.7
chair 9.5
career 9.5
clothing 9.5
desk 9.4
casual 9.3
teamwork 9.3
indoor 9.1
fashion 9
one 9
cheerful 8.9
education 8.7
table 8.7
happiness 8.6
tie 8.5
two 8.5
doctor 8.5
modern 8.4
mature 8.4
color 8.3
coat 8.3
health 8.3
waiter 8
women 7.9
standing 7.8
colleagues 7.8
thinking 7.6
employee 7.5
fun 7.5
day 7.1
domestic 7
together 7
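
Imagga exposes a comparable auto-tagging endpoint over REST. A minimal sketch, assuming the Imagga v2 /tags endpoint with HTTP basic auth; the credentials and file name are placeholders:

    import requests

    API_KEY = "your_api_key"        # placeholder
    API_SECRET = "your_api_secret"  # placeholder

    with open("photograph.jpg", "rb") as f:
        response = requests.post(
            "https://api.imagga.com/v2/tags",
            auth=(API_KEY, API_SECRET),
            files={"image": f},
        )

    # Tags arrive as {"tag": {"en": ...}, "confidence": ...} objects.
    for item in response.json()["result"]["tags"]:
        print(f"{item['tag']['en']} {item['confidence']:.1f}")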

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

wall 99.2
indoor 96.7
text 96.7
person 91.9
clothing 90.6
man 82.1
black and white 58.8
old 56.6
cluttered 15.8

Face analysis

AWS Rekognition (face 1)

Age 54-64
Gender Male, 99.9%
Calm 73%
Surprised 9.6%
Confused 6.4%
Sad 3.1%
Disgusted 3%
Happy 2.8%
Fear 1%
Angry 1%

AWS Rekognition (face 2)

Age 50-58
Gender Male, 99.9%
Calm 67.5%
Confused 19.7%
Sad 7.2%
Surprised 2.5%
Disgusted 1.1%
Fear 1%
Happy 0.6%
Angry 0.5%
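
Paired age-range, gender, and emotion estimates like the two blocks above are what Amazon Rekognition's DetectFaces API returns when all facial attributes are requested. A minimal boto3 sketch (image path is a placeholder):

    import boto3

    client = boto3.client("rekognition")
    with open("photograph.jpg", "rb") as f:
        response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age = face["AgeRange"]  # e.g. {"Low": 54, "High": 64}
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        # Emotions are type/confidence pairs, listed highest first above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")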

Google Vision (face 1)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision (face 2)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
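
The ratings above follow Google Cloud Vision's face detection output, which reports likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages. A sketch assuming the google-cloud-vision client library and configured GCP credentials:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("photograph.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)
    for face in response.face_annotations:
        # Each attribute is a Likelihood enum; .name yields e.g. VERY_UNLIKELY,
        # rendered above as "Very unlikely".
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)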

Feature analysis

Amazon

Person 97.7%
Microphone 78.3%
Chair 77.4%

Captions

Microsoft

an old photo of a person cooking in a kitchen 52.1%
an old photo of a person in a kitchen 52%
old photo of a person in a kitchen 51.9%
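
Ranked caption candidates with confidence scores are the output shape of Azure's Computer Vision describe operation (the Microsoft tag list above comes from the same service). A minimal sketch, assuming the azure-cognitiveservices-vision-computervision package with a placeholder endpoint and key:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder
        CognitiveServicesCredentials("your_key"),                # placeholder
    )
    with open("photograph.jpg", "rb") as f:
        analysis = client.describe_image_in_stream(f, max_candidates=3)

    # Confidence is reported on a 0-1 scale; the listing above shows percentages.
    for caption in analysis.captions:
        print(f"{caption.text} {caption.confidence * 100:.1f}%")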

Text analysis

Amazon

YAL
SID
YТ37А-MX
creatos
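
Short, partly garbled strings like these are typical of OCR against photographic prints, where film-edge markings and background signage dominate. A minimal sketch of Amazon Rekognition's DetectText call with boto3 (image path is a placeholder):

    import boto3

    client = boto3.client("rekognition")
    with open("photograph.jpg", "rb") as f:
        response = client.detect_text(Image={"Bytes": f.read()})

    # LINE detections correspond to the strings listed above; WORD entries
    # break each line into components.
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"])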

Google

SIA
SIA YT37A2-XAGON
YT37A2-XAGON
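
Google's strings come from Cloud Vision text detection, where the first annotation holds the full recovered text and later annotations its components. A sketch assuming the google-cloud-vision client library:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("photograph.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.text_detection(image=image)
    for annotation in response.text_annotations:
        print(annotation.description)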