Human Generated Data

Title

Untitled (close-up view of woman sitting at table with cocktails at party)

Date

c. 1946

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14328

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Person 99.1
Human 99.1
Musician 98.8
Musical Instrument 98.8
Person 97.8
Tie 95.9
Accessories 95.9
Accessory 95.9
Person 95.8
Person 95.7
Person 92
Leisure Activities 84.6
Percussion 81.2
Drum 81.2
Drummer 76.7
Guitar 76.3
Guitarist 76.3
Performer 76.3
Music Band 63.9
Sunglasses 63.6
Person 62.7
Apparel 59.9
Clothing 59.9
People 57.2
Person 56.2
Pianist 55.8
Piano 55.8
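Each machine-generated tag above is a label paired with a confidence score on one line. A minimal parsing sketch, assuming the `Label score` line format shown above (the helper name `parse_tags` is ours, not part of any API):

```python
def parse_tags(lines):
    """Parse 'Label 99.1'-style lines into (label, confidence) pairs."""
    tags = []
    for line in lines:
        line = line.strip()
        if not line:
            continue
        # Split on the last space: the label may contain spaces
        # (e.g. 'Musical Instrument 98.8'), but the score never does.
        label, _, score = line.rpartition(" ")
        tags.append((label, float(score)))
    return tags

sample = ["Person 99.1", "Musical Instrument 98.8", "Tie 95.9"]
print(parse_tags(sample))
```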

Imagga
created on 2022-01-29

person 40
office 34.5
man 33.6
computer 33.6
people 33.5
adult 29.7
laptop 29.3
business 28.6
professional 27.7
working 26.5
work 25.1
male 23.4
sitting 23.2
desk 21.9
smiling 21
worker 20.6
businessman 20.3
happy 20.1
teacher 18
job 17.7
corporate 17.2
indoors 16.7
women 16.6
smile 16.4
businesswoman 16.4
room 15.7
technology 15.6
men 15.5
executive 15.1
student 14.8
education 14.7
casual 14.4
hairdresser 14.4
communication 14.3
classroom 14.2
keyboard 13.1
home 12.8
businesspeople 12.3
portrait 12.3
group 12.1
confident 11.8
chair 11.7
handsome 11.6
salon 11.5
table 11.4
meeting 11.3
modern 11.2
occupation 11
monitor 10.9
success 10.5
notebook 10.5
one 10.5
mature 10.2
successful 10.1
patient 10
looking 9.6
studying 9.6
career 9.5
adults 9.5
lifestyle 9.4
senior 9.4
equipment 9
suit 9
screen 9
employee 8.9
couple 8.7
class 8.7
talking 8.6
college 8.5
face 8.5
sit 8.5
specialist 8.5
pretty 8.4
attractive 8.4
manager 8.4
teamwork 8.3
indoor 8.2
cheerful 8.1
color 7.8
concentration 7.7
health 7.6
shop 7.6
educator 7.6
workplace 7.6
studio 7.6
learning 7.5
leisure 7.5
training 7.4
entrepreneur 7.3
team 7.2
to 7.1

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

person 98.2
text 96.5
musical instrument 91.3
clothing 87.4
indoor 87.1
human face 71.6
drum 71.2
people 61.8
man 54.8
clothes 16.6

Face analysis

AWS Rekognition

Age 39-47
Gender Female, 99%
Happy 68.7%
Calm 13.6%
Fear 7.4%
Surprised 3.3%
Sad 2.6%
Disgusted 1.7%
Confused 1.5%
Angry 1.1%

AWS Rekognition

Age 29-39
Gender Female, 78.2%
Calm 93.7%
Happy 2.4%
Sad 2.3%
Surprised 0.5%
Angry 0.4%
Confused 0.4%
Disgusted 0.2%
Fear 0.2%

AWS Rekognition

Age 21-29
Gender Female, 99.4%
Calm 99.7%
Surprised 0.1%
Angry 0.1%
Disgusted 0%
Confused 0%
Fear 0%
Sad 0%
Happy 0%

AWS Rekognition

Age 23-33
Gender Female, 93.4%
Calm 91.2%
Sad 3.8%
Surprised 2.3%
Happy 0.9%
Confused 0.7%
Fear 0.5%
Disgusted 0.3%
Angry 0.3%
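Each AWS Rekognition face block above reports a full emotion distribution; the face's dominant emotion is simply the highest-scoring entry. A small illustrative sketch (the values in `face_1` are copied from the first face block above; the helper name is ours):

```python
def dominant_emotion(emotions):
    """Return the (name, percent) pair with the highest confidence."""
    return max(emotions.items(), key=lambda kv: kv[1])

# Emotion percentages for the first detected face (Age 39-47).
face_1 = {
    "Happy": 68.7, "Calm": 13.6, "Fear": 7.4, "Surprised": 3.3,
    "Sad": 2.6, "Disgusted": 1.7, "Confused": 1.5, "Angry": 1.1,
}
print(dominant_emotion(face_1))  # ('Happy', 68.7)
```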

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.1%
Tie 95.9%
Sunglasses 63.6%

Captions

Microsoft

a group of people posing for the camera 84.8%
a group of people posing for a photo 78.1%
a group of people posing for a picture 78%

Text analysis

Amazon

YT33A2
M 113
M 113 YT33A2 AREA
AREA

Google

MI13 YT33A2 ARA
YT33A2
ARA
MI13
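The two OCR engines disagree on most of the text they read from the image; a set intersection shows which strings both engines agree on exactly (the strings below are copied from the two lists above):

```python
# Raw text detections from each engine, as listed above.
amazon = {"YT33A2", "M 113", "M 113 YT33A2 AREA", "AREA"}
google = {"MI13 YT33A2 ARA", "YT33A2", "ARA", "MI13"}

# Only strings read identically by both engines survive.
print(amazon & google)  # {'YT33A2'}
```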