Human Generated Data

Title

Untitled (family seated in living room with father reading aloud)

Date

1947

People

Artist: Orrion Barger, American, active 1913–1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6497

Machine Generated Data

Tags

Amazon
created on 2019-03-22

Furniture 99.8
Human 99.4
Person 99.4
Person 99.2
Person 98.7
Person 97.7
Living Room 97.3
Indoors 97.3
Room 97.3
Couch 97
Chair 95
Fireplace 84.5
Tie 77.8
Accessories 77.8
Accessory 77.8
Hearth 74.9
Clothing 71.4
Apparel 71.4
Home Decor 64.9
People 62.2
Animal 58.7
Mammal 58.7
Canine 58.7
Waiting Room 57.7
Screen 55.5
Electronics 55.5
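
The label/score pairs above are the kind of output returned by Amazon Rekognition's DetectLabels API, where each label carries a confidence score between 0 and 100. A minimal sketch of how such tags could be regenerated with boto3 follows; the image path and confidence threshold are illustrative assumptions, not the values used to produce this record.

```python
# Minimal sketch: label tagging with Amazon Rekognition's DetectLabels API.
# The image path and MinConfidence value are placeholders.
import boto3

def detect_labels(image_path, min_confidence=50.0):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        image_bytes = f.read()
    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=min_confidence,
    )
    # Each label pairs a name with a confidence score (0-100),
    # matching the "tag  score" pairs listed above.
    return [(label["Name"], label["Confidence"]) for label in response["Labels"]]

if __name__ == "__main__":
    for name, confidence in detect_labels("photograph.jpg"):
        print(f"{name} {confidence:.1f}")
```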

Clarifai
created on 2019-03-22

people 99.9
adult 98.5
group 97.8
furniture 97
leader 96.6
woman 96.2
man 96.2
group together 96.1
administration 94.8
sit 94.2
chair 93.3
two 91.7
room 91.1
several 90
seat 88.5
four 86.6
three 86
five 85.4
medical practitioner 84.4
facial expression 84

Imagga
created on 2019-03-22

kin 42.7
man 37.6
people 35.1
male 34
person 30.6
senior 26.2
adult 24.8
home 23.9
room 23.3
couple 22.6
sitting 21.5
men 21.5
grandfather 21.5
indoors 20.2
business 20
businessman 19.4
happy 18.2
mature 17.7
smiling 17.4
together 16.6
women 16.6
office 15.4
old 15.3
meeting 15.1
table 14.7
cheerful 13.8
group 13.7
portrait 13.6
teacher 13.5
executive 13.4
businesspeople 13.3
classroom 12.6
colleagues 12.6
happiness 12.5
elderly 12.4
talking 12.4
lifestyle 12.3
20s 11.9
family 11.6
30s 11.5
relaxed 11.3
laptop 11.1
casual 11
indoor 11
smile 10.7
retired 10.7
camera 10.2
team 9.9
older 9.7
mother 9.6
desk 9.4
friends 9.4
two 9.3
horizontal 9.2
businesswoman 9.1
grandma 9
professional 8.9
couch 8.7
love 8.7
educator 8.7
newspaper 8.7
retirement 8.6
four 8.6
world 8.6
face 8.5
color 8.3
spectator 8.1
patient 8.1
computer 8
interior 8
job 8
working 8
70s 7.9
day 7.8
corporate 7.7
married 7.7
twenties 7.6
husband 7.6
chair 7.6
friendship 7.5
pensioner 7.4
relaxing 7.3
suit 7.2
work 7.1
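
The Imagga tags above follow the same tag-plus-confidence pattern. A minimal sketch of a request to Imagga's v2 tagging endpoint is shown below, assuming HTTP Basic authentication with an API key and secret and a publicly reachable image URL; all three values are placeholders.

```python
# Minimal sketch: tagging an image with Imagga's v2 /tags endpoint.
# The API key/secret and image URL are placeholders, not the
# credentials or image used for this record.
import requests

API_KEY = "your_api_key"        # placeholder
API_SECRET = "your_api_secret"  # placeholder

def imagga_tags(image_url):
    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": image_url},
        auth=(API_KEY, API_SECRET),
    )
    response.raise_for_status()
    data = response.json()
    # Each entry pairs an English tag with a confidence score (0-100).
    return [(t["tag"]["en"], t["confidence"]) for t in data["result"]["tags"]]

if __name__ == "__main__":
    for tag, confidence in imagga_tags("https://example.com/photo.jpg"):
        print(f"{tag} {confidence:.1f}")
```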

Google
created on 2019-03-22

Microsoft
created on 2019-03-22

indoor 90.3
old 46.7
older 24.3
person 24.3
family 11.3
ballet 7.8
black and white 4.5
boy 3.6

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 23-38
Gender Female, 54.2%
Sad 54%
Disgusted 45.3%
Angry 45.1%
Calm 45.2%
Happy 45%
Confused 45.2%
Surprised 45.1%

AWS Rekognition

Age 48-68
Gender Male, 54.2%
Calm 47.8%
Disgusted 45.1%
Confused 45.3%
Happy 45.1%
Sad 51%
Surprised 45.1%
Angry 45.6%

AWS Rekognition

Age 45-66
Gender Female, 54.5%
Sad 45.6%
Disgusted 45.1%
Angry 45.2%
Calm 53%
Happy 45.8%
Confused 45.1%
Surprised 45.2%

AWS Rekognition

Age 35-52
Gender Female, 54.7%
Disgusted 45%
Surprised 45%
Angry 45.1%
Sad 45.2%
Calm 54.5%
Confused 45.1%
Happy 45.1%
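
The per-face estimates above (age range, gender with confidence, and scores for each emotion label) correspond to the output of Amazon Rekognition's DetectFaces API when all facial attributes are requested. A minimal boto3 sketch follows; the image path is an illustrative assumption.

```python
# Minimal sketch: per-face age, gender, and emotion estimates with
# Amazon Rekognition's DetectFaces API (Attributes=["ALL"]).
# The image path is a placeholder.
import boto3

def analyze_faces(image_path):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        image_bytes = f.read()
    response = client.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )
    results = []
    for face in response["FaceDetails"]:
        results.append({
            "age_range": (face["AgeRange"]["Low"], face["AgeRange"]["High"]),
            "gender": (face["Gender"]["Value"], face["Gender"]["Confidence"]),
            "emotions": {e["Type"]: e["Confidence"] for e in face["Emotions"]},
        })
    return results
```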

Feature analysis

Amazon

Person 99.4%
Couch 97%
Chair 95%
Tie 77.8%