Human Generated Data

Title

Untitled (men in Tide magazine office)

Date

1951

People

Artist: Peter James Studio, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20156

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.6
Human 99.6
Person 99.4
Person 98.7
Apparel 91.8
Clothing 91.8
Sailor Suit 91.6
Person 90.6
Clinic 82.8
Shop 81.1
Sleeve 77.1
Coat 70.9
Hand 70.4
Person 67.8
Flooring 66.4
Suit 65.6
Overcoat 65.6
Long Sleeve 64.4
Hospital 59.5
Operating Theatre 56.9
Military 55.4

Imagga
created on 2022-03-05

supermarket 48.2
grocery store 38.3
marketplace 28.6
people 26.8
man 25.5
boutique 25.3
mercantile establishment 25.1
adult 23.3
business 21.9
brass 21.7
trombone 21.4
women 18.2
male 17
professional 17
wind instrument 16.7
work 16.5
building 16.3
team 16.1
hospital 15.5
men 15.4
indoors 14.9
urban 14.8
modern 14.7
person 14.6
medical 14.1
place of business 13.6
shop 13.2
health 13.2
city 12.5
working 12.4
businessman 12.4
medicine 12.3
walking 12.3
patient 12.3
group 12.1
industry 11.9
clothing 11.5
interior 11.5
surgeon 11.4
doctor 11.3
happy 11.3
nurse 11.2
teamwork 11.1
musical instrument 11.1
worker 11
businesswoman 10.9
corporate 10.3
lifestyle 10.1
occupation 10.1
attractive 9.8
groom 9.7
station 9.7
couple 9.6
blurred 9.6
room 9.3
industrial 9.1
dress 9
transportation 9
job 8.8
train 8.8
life 8.8
shoe shop 8.8
hands 8.7
factory 8.7
smiling 8.7
crowd 8.6
architecture 8.6
window 8.4
equipment 8.4
inside 8.3
human 8.2
suit 8.1
success 8
day 7.8
smile 7.8
happiness 7.8
standing 7.8
surgery 7.8
travel 7.7
pretty 7.7
casual 7.6
engineering 7.6
fashion 7.5
clothes 7.5
holding 7.4
care 7.4
wedding 7.4
gate 7.3
black 7.2
uniform 7.1

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

person 97.7
clothing 92.6
text 88.4
man 86.6
black and white 75.8
footwear 52.2

Face analysis

Amazon

Google

AWS Rekognition

Age 29-39
Gender Male, 83.2%
Calm 98.1%
Surprised 0.4%
Confused 0.3%
Disgusted 0.3%
Angry 0.3%
Sad 0.3%
Happy 0.3%
Fear 0.1%

AWS Rekognition

Age 22-30
Gender Male, 89.8%
Calm 99%
Sad 0.9%
Surprised 0%
Confused 0%
Disgusted 0%
Angry 0%
Happy 0%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%

Captions

Microsoft

a group of people standing in a room 92.9%
a group of people in a room 92.3%
a group of people standing around each other 81.3%

Text analysis

Amazon

ODI
KODAR-SEELA
900

Google

GLATS
GLATS