Human Generated Data

Title

Untitled (luncheon, woman holding fishing rod)

Date

1951

People

Artist: Peter James Studio, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20032

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Human 99.3
Person 99.3
Person 99.2
Person 98.8
Musical Instrument 98.6
Musician 98.6
Person 96.7
Person 95.6
Sitting 90.9
Music Band 84.7
Leisure Activities 84.7
Guitarist 83.2
Performer 83.2
Guitar 83.2
Crowd 73.7
Person 67.7
Apparel 67.5
Suit 67.5
Overcoat 67.5
Clothing 67.5
Coat 67.5
Stage 65.5
Couch 55.5
Furniture 55.5
Pianist 55.2
Piano 55.2

Imagga
created on 2022-03-05

marimba 78.2
percussion instrument 75.7
musical instrument 62.4
man 44.3
male 36.9
people 36.2
person 33.1
table 28.5
office 27.5
business 26.7
adult 26.2
smiling 26
sitting 25.8
men 24
indoors 22.8
meeting 22.6
group 22.6
happy 22.6
businessman 22.1
work 22.1
room 21
home 20.7
laptop 20.2
couple 20
worker 20
executive 19.9
together 19.3
teacher 19
team 17.9
senior 17.8
professional 17.2
desk 17
teamwork 16.7
women 16.6
colleagues 16.5
talking 16.2
mature 14.9
education 14.7
cheerful 14.6
businesswoman 14.5
computer 14.5
looking 14.4
businesspeople 14.2
vibraphone 14.1
corporate 13.7
smile 13.5
modern 13.3
happiness 13.3
communication 12.6
waiter 12.5
interior 12.4
working 12.4
chair 12.3
classroom 12
suit 11.7
job 11.5
employee 10.7
workplace 10.5
student 10.3
lifestyle 10.1
indoor 10
restaurant 9.9
holding 9.9
conference 9.8
teaching 9.7
mid adult 9.6
leader 9.6
drinking 9.6
study 9.3
wine 9.2
occupation 9.2
successful 9.1
hand 9.1
confident 9.1
portrait 9.1
handsome 8.9
medical 8.8
40s 8.8
discussion 8.8
two 8.5
doctor 8.5
manager 8.4
presentation 8.4
drink 8.3
coffee 8.3
technology 8.2
board 8.1
meal 8.1
kitchen 8
to 8
coworkers 7.9
two people 7.8
busy 7.7
30s 7.7
elderly 7.7
device 7.6
friends 7.5
human 7.5
life 7.5
patient 7.2
food 7.2
love 7.1

Microsoft
created on 2022-03-05

person 99.8
man 91.9
clothing 89.7
concert 87.2
text 64.2
human face 57.6
musical instrument 57.2

Face analysis

AWS Rekognition

Age 25-35
Gender Male, 64.8%
Calm 36.1%
Happy 33.3%
Disgusted 8.5%
Surprised 7.6%
Angry 6%
Sad 3.9%
Confused 3.3%
Fear 1.3%

AWS Rekognition

Age 43-51
Gender Male, 99.8%
Happy 82.2%
Surprised 3.7%
Calm 3.3%
Sad 3.3%
Disgusted 3.2%
Angry 2.8%
Confused 0.7%
Fear 0.7%

AWS Rekognition

Age 31-41
Gender Male, 99.9%
Calm 71.9%
Surprised 16.6%
Disgusted 2.3%
Happy 2.1%
Angry 2%
Sad 1.8%
Confused 1.7%
Fear 1.5%

AWS Rekognition

Age 7-17
Gender Male, 99.4%
Fear 42.1%
Calm 21.9%
Sad 9.5%
Surprised 7.7%
Confused 7.5%
Disgusted 4.9%
Happy 4%
Angry 2.3%

AWS Rekognition

Age 9-17
Gender Male, 95.8%
Disgusted 41.7%
Calm 30.3%
Sad 9.4%
Angry 9.2%
Fear 3%
Confused 2.7%
Happy 1.9%
Surprised 1.7%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%

Captions

Microsoft

a group of people sitting at a table 93.7%
a group of people performing on a counter 93.6%
a group of people sitting around a table 93.5%

Text analysis

Amazon

BEANS
KODAK-SVEELA