Human Generated Data

Title

Untitled (four people gathered around juke box choosing songs)

Date

c. 1950-1960

People

Artist: Claseman Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11076

Machine Generated Data

Tags

Amazon
created on 2019-03-25

Human 98.2
Person 98.2
Person 97.9
Indoors 92.5
Room 92.4
Person 83.2
Clothing 75.7
Apparel 75.7
Bedroom 66.2
Advertisement 65.4
Poster 64.9
Female 63.2
Person 62.5
Collage 61.4
Dressing Room 59.7
Sleeve 57.5
Dorm Room 56.4
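
The Amazon tags above are object and scene labels paired with confidence scores. A minimal sketch (not part of the source record) of how labels in this form could be requested from AWS Rekognition with boto3; the region and file name are placeholders, since the museum image itself is not distributed with this record:

```python
# Sketch only: assumes AWS credentials are configured and boto3 is installed.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Placeholder file name for the photograph being tagged.
with open("photograph.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=55,
)

# Print "label confidence" pairs, mirroring the layout of the tag list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```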

Clarifai
created on 2019-03-25

people 99.5
man 98
group 97.1
veil 96.8
adult 95.9
group together 95.4
woman 93.5
two 88.6
wedding 87.2
wear 86.2
indoors 85.6
three 83.2
monochrome 82.7
ceremony 80.3
leader 79.7
child 78.4
actor 76.2
music 75.5
five 72.9
outfit 72.8

Imagga
created on 2019-03-25

man 34.2
senior 30
male 28.4
person 28.3
negative 27.5
people 26.2
surgeon 23.2
old 23
elderly 22
film 21.6
adult 21
mature 20.4
retired 19.4
retirement 17.3
home 16.7
photographic paper 16.7
newspaper 16
indoors 15.8
happy 15.7
men 14.6
computer 14.4
casual 14.4
medical 14.1
patient 14.1
together 14
world 14
couple 13.9
portrait 13.6
businessman 13.2
sitting 12.9
looking 12.8
face 12.8
health 12.5
lifestyle 12.3
laptop 11.8
aged 11.8
product 11.5
age 11.4
doctor 11.3
pensioner 11.2
professional 11.2
photographic equipment 11.1
work 11
day 11
business 10.9
horizontal 10.9
gray 10.8
older 10.7
hospital 10.5
specialist 10.5
camera 10.2
smiling 10.1
hand 9.9
70s 9.8
60s 9.8
room 9.7
clothing 9.7
nurse 9.6
illness 9.5
hair 9.5
love 9.5
creation 9.3
occupation 9.2
holding 9.1
handsome 8.9
color 8.9
to 8.8
40s 8.8
daily 8.8
profession 8.6
architecture 8.6
human 8.2
cheerful 8.1
team 8.1
gray hair 7.9
happiness 7.8
statue 7.7
office 7.7
planner 7.7
teacher 7.5
meeting 7.5
leisure 7.5
technology 7.4
care 7.4
indoor 7.3
women 7.1
working 7.1
look 7
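
The Imagga tags above follow the same label-plus-confidence pattern. A hedged sketch of how comparable tags could be fetched from Imagga's v2 tagging endpoint with the requests library; the API key, secret, image URL, and the response layout noted in the comment are assumptions based on Imagga's public documentation, not details taken from this record:

```python
# Sketch only: IMAGGA_KEY, IMAGGA_SECRET, and the image URL are placeholders.
import requests

IMAGGA_KEY = "your_api_key"
IMAGGA_SECRET = "your_api_secret"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photograph.jpg"},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
    timeout=30,
)
response.raise_for_status()

# Assumed response shape: {"result": {"tags": [{"confidence": ..., "tag": {"en": ...}}]}}
for item in response.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```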

Google
created on 2019-03-25

Microsoft
created on 2019-03-25

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Male, 77.1%
Angry 12.5%
Calm 17.1%
Happy 36.4%
Surprised 4%
Sad 22.9%
Disgusted 2.5%
Confused 4.7%

AWS Rekognition

Age 26-43
Gender Female, 90.3%
Sad 5%
Calm 6%
Happy 2.4%
Surprised 1.8%
Angry 3.6%
Disgusted 78.9%
Confused 2.3%

AWS Rekognition

Age 26-43
Gender Male, 69.1%
Angry 4.4%
Sad 40.3%
Surprised 1.8%
Confused 2.6%
Calm 46.1%
Happy 4.3%
Disgusted 0.5%

AWS Rekognition

Age 26-43
Gender Female, 51.3%
Happy 94.8%
Confused 0.9%
Calm 1%
Sad 2.1%
Disgusted 0.1%
Angry 0.5%
Surprised 0.6%

AWS Rekognition

Age 14-25
Gender Female, 54.7%
Calm 46.3%
Sad 50.6%
Disgusted 45.3%
Happy 45.7%
Confused 45.6%
Angry 46.2%
Surprised 45.3%
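
The five blocks above report per-face estimates (age range, gender, and emotion scores) from AWS Rekognition. A minimal sketch, assuming the same boto3 setup and placeholder file name as the label example earlier, of how output in this shape could be obtained from the detect_faces API:

```python
# Sketch only: assumes AWS credentials are configured and boto3 is installed.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photograph.jpg", "rb") as f:  # placeholder file name
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, and emotion estimates
)

# Print one block per detected face, mirroring the layout above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
    print()
```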

Feature analysis

Amazon

Person 98.2%