Human Generated Data

Title

Untitled (man and woman shaking hands)

Date

1952

People

Artist: Peter James Studio, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20177

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Apparel 99.8
Clothing 99.8
Person 99.7
Human 99.7
Chair 99.6
Furniture 99.6
Chair 94.6
Coat 91.7
Person 90.6
Overcoat 90.2
Suit 88.5
Sleeve 87
Shirt 75.7
Long Sleeve 67.9
Floor 64.1
Tuxedo 59.9
Door 59.6
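The Amazon tag list above is the kind of output AWS Rekognition's DetectLabels returns: one label name per line with a confidence score. A minimal sketch of turning such a response into these lines, assuming a Rekognition-shaped dict (the boto3 call is shown commented out; the sample response below is illustrative, copied from the first three tags above, not re-queried from the API):

```python
# Sketch: formatting a Rekognition DetectLabels-style response into
# "Name confidence" lines like the Amazon tag list above.

# The real call would look like this (bucket/key are hypothetical):
# import boto3
# client = boto3.client("rekognition")
# response = client.detect_labels(
#     Image={"S3Object": {"Bucket": "my-bucket", "Name": "photo.jpg"}},
#     MinConfidence=55,
# )

# Illustrative response, values taken from the tag list above.
sample_response = {
    "Labels": [
        {"Name": "Apparel", "Confidence": 99.8},
        {"Name": "Person", "Confidence": 99.7},
        {"Name": "Chair", "Confidence": 99.6},
    ]
}

def format_labels(response):
    """Return one 'Name score' line per label, score rounded to one decimal."""
    return [
        f"{label['Name']} {round(label['Confidence'], 1)}"
        for label in response["Labels"]
    ]

for line in format_labels(sample_response):
    print(line)
```

Duplicate names in the list above (two "Chair" and "Person" entries) are expected: Rekognition reports both a scene-level label and per-instance detections for the same class.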

Imagga
created on 2022-03-05

man 47.7
male 34
people 27.9
person 27.5
business 26.7
men 23.2
worker 23.1
job 20.3
indoors 20.2
patient 20.1
professional 19.3
businessman 18.5
shop 18.3
work 18
adult 17.8
working 17.7
barbershop 16.7
office 15.4
case 15
smiling 14.5
happy 14.4
building 14
manager 13
nurse 13
corporate 12.9
occupation 12.8
industry 12.8
urban 12.2
smile 12.1
group 12.1
women 11.9
casual 11.9
helmet 11.7
team 11.6
handsome 11.6
workplace 11.4
standing 11.3
suit 11.1
room 11
mercantile establishment 11
sick person 10.9
executive 10.9
lifestyle 10.8
clothing 10.7
career 10.4
adults 10.4
meeting 10.4
sitting 10.3
hairdresser 10.2
black 10.2
teamwork 10.2
engineer 10.1
employee 10.1
equipment 10
cleaner 10
city 10
medical 9.7
construction 9.4
doctor 9.4
two 9.3
safety 9.2
surgeon 9.2
successful 9.1
indoor 9.1
industrial 9.1
cheerful 8.9
hospital 8.9
life 8.9
success 8.8
architect 8.7
diversity 8.6
development 8.6
kitchen 8.4
clothes 8.4
modern 8.4
portrait 8.4
inside 8.3
student 8.1
looking 8
happiness 7.8
hands 7.8
employment 7.7
builder 7.6
window 7.5
human 7.5
one 7.5
place of business 7.4
foreman 7.2
to 7.1
interior 7.1
medicine 7

Google
created on 2022-03-05

(no tags returned)

Microsoft
created on 2022-03-05

person 99
man 98.4
text 93.2
clothing 92.5
standing 86
black and white 69.2
sport 67.2
male 15.5

Face analysis

AWS Rekognition

Age 47-53
Gender Male, 69.3%
Calm 99.8%
Surprised 0.1%
Sad 0.1%
Happy 0%
Disgusted 0%
Confused 0%
Angry 0%
Fear 0%
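The readings above follow the shape of Rekognition's DetectFaces output: an age range, a gender estimate, and a per-emotion confidence list. A minimal sketch of reducing that list to the dominant emotion, assuming a FaceDetails-shaped dict (values are copied from the readings above, not re-queried):

```python
# Sketch: picking the dominant emotion from a Rekognition
# DetectFaces-style emotion list, as in the "Calm 99.8%" reading above.

# Illustrative FaceDetails-shaped dict; values copied from the readings above.
face_detail = {
    "AgeRange": {"Low": 47, "High": 53},
    "Gender": {"Value": "Male", "Confidence": 69.3},
    "Emotions": [
        {"Type": "CALM", "Confidence": 99.8},
        {"Type": "SURPRISED", "Confidence": 0.1},
        {"Type": "SAD", "Confidence": 0.1},
    ],
}

def dominant_emotion(face):
    """Return the (type, confidence) pair with the highest confidence."""
    top = max(face["Emotions"], key=lambda e: e["Confidence"])
    return top["Type"], top["Confidence"]

print(dominant_emotion(face_detail))  # → ('CALM', 99.8)
```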

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Chair 99.6%
Suit 88.5%

Captions

Microsoft

a man standing in front of a building 91.1%
a man standing next to a building 89.7%
a man that is standing in front of a building 87.3%

Text analysis

Amazon

EAD
KODAK-S.VEELA