Human Generated Data

Title

Untitled (two men sitting on inflatable toys)

Date

1951

People

Artist: Peter James Studio, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20244

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Human 98.7
Person 98.7
Person 98.5
Musical Instrument 85.1
Musician 85.1
Tie 84.3
Accessories 84.3
Accessory 84.3
Clothing 83.3
Apparel 83.3
Face 77.5
Chair 76.9
Furniture 76.9
Floor 75.2
Brass Section 74.6
Horn 74.6
Sleeve 74.6
Photography 68.8
Photo 68.8
Portrait 68.8
Long Sleeve 66
Leisure Activities 64.3
Food 58.8
Dish 58.8
Meal 58.8
Music Band 57.5
Trumpet 56.9
Cornet 56.9
Flooring 56.4
Saxophone 55.6
Tie 51.8

Imagga
created on 2022-03-05

sax 54.6
brass 52.4
wind instrument 46.8
cornet 29.1
musical instrument 25.2
man 22.2
horn 20.4
male 19.2
device 18.4
people 17.8
adult 16.2
bride 15.9
person 15.5
couple 14.8
wedding 14.7
groom 13.5
glass 13.3
medical 12.3
suit 12.2
instrumentality 11.7
portrait 11.6
music 11.1
love 11
two 11
decoration 11
ceremony 10.7
laboratory 10.6
old 10.4
biology 10.4
table 10.4
celebration 10.4
men 10.3
attractive 9.8
businessman 9.7
black 9.6
marriage 9.5
drink 9.2
modern 9.1
dress 9
musician 8.9
medicine 8.8
happy 8.8
instrument 8.7
chemistry 8.7
chemical 8.7
clothing 8.6
party 8.6
husband 8.6
luxury 8.6
wife 8.5
business 8.5
flower 8.4
bottle 8.3
health 8.3
care 8.2
gold 8.2
playing 8.2
cheerful 8.1
worker 8
romance 8
office 8
interior 8
smiling 7.9
indoors 7.9
work 7.8
happiness 7.8
artifact 7.8
lab 7.8
professional 7.8
bouquet 7.7
sitting 7.7
test 7.7
research 7.6
fashion 7.5
human 7.5
women 7.1
antique 7.1

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

wall 96.6
black and white 80.4
text 79.4
person 58.2

Face analysis

AWS Rekognition

Age 45-51
Gender Female, 75.8%
Calm 64.6%
Sad 32.9%
Angry 0.6%
Happy 0.5%
Confused 0.4%
Disgusted 0.4%
Fear 0.3%
Surprised 0.3%

AWS Rekognition

Age 45-51
Gender Male, 100%
Happy 98.7%
Confused 0.4%
Surprised 0.4%
Angry 0.2%
Disgusted 0.1%
Calm 0.1%
Fear 0.1%
Sad 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.7%
Tie 84.3%

Captions

Microsoft

a person standing in front of a mirror posing for the camera 79.1%
a man and woman standing in front of a mirror posing for the camera 61.1%
a person that is standing in front of a mirror posing for the camera 61%

Text analysis

Amazon

CAP