Human Generated Data

Title

Untitled (portrait of family in living room)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16607

Machine Generated Data

Tags (label and confidence score, 0-100)

Amazon
created on 2022-02-12

Person 99.6
Human 99.6
Person 99.3
Person 98.7
Person 98.6
Furniture 96.6
Shoe 93
Footwear 93
Clothing 93
Apparel 93
Person 92.9
Tie 87.7
Accessories 87.7
Accessory 87.7
Chair 86.9
People 83.7
Sitting 81.7
Shorts 78.7
Face 77.9
Couch 75.5
Living Room 72.8
Room 72.8
Indoors 72.8
Monitor 69.6
Display 69.6
Screen 69.6
Electronics 69.6
Female 68.5
Suit 67
Coat 67
Overcoat 67
Portrait 65.1
Photography 65.1
Photo 65.1
Baby 57
LCD Screen 55.9
Kid 55.3
Child 55.3
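
The Amazon tag list above is the kind of label/confidence output produced by Amazon Rekognition label detection. The sketch below shows, under stated assumptions, how such tags could be reproduced with boto3; the filename, AWS credentials/region, and thresholds are placeholders, not the museum's actual pipeline.

```python
import boto3

# Hedged sketch: Amazon Rekognition label detection producing
# label/confidence pairs like the Amazon tag list above.
# "family-portrait.jpg" and the thresholds are placeholder assumptions.
client = boto3.client("rekognition")

with open("family-portrait.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,
        MinConfidence=55,  # the list above bottoms out around 55
    )

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```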

Clarifai
created on 2023-10-29

people 99.9
group 99.6
group together 98.7
man 98.2
adult 97.9
woman 95.3
several 91.6
three 91.3
child 90.2
many 89.9
wear 88.7
leader 87.5
four 84.4
five 81.9
uniform 80
medical practitioner 78.7
actor 78.6
elderly 77.8
retro 77.4
boy 75.2
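
The Clarifai concepts above (people, group, man, and so on) resemble output from Clarifai's public general image recognition model. The following is a hedged sketch using the clarifai-grpc client; the API key, model ID, and filename are placeholder assumptions, and some account setups also require a user_app_id on the request.

```python
from clarifai_grpc.channel.clarifai_channel import ClarifaiChannel
from clarifai_grpc.grpc.api import resources_pb2, service_pb2, service_pb2_grpc
from clarifai_grpc.grpc.api.status import status_code_pb2

# Hedged sketch: Clarifai's general recognition model returns
# concept/confidence pairs like the Clarifai list above.
stub = service_pb2_grpc.V2Stub(ClarifaiChannel.get_grpc_channel())
metadata = (("authorization", "Key YOUR_CLARIFAI_API_KEY"),)  # placeholder key

with open("family-portrait.jpg", "rb") as f:  # placeholder filename
    request = service_pb2.PostModelOutputsRequest(
        model_id="general-image-recognition",  # assumed public model ID
        inputs=[
            resources_pb2.Input(
                data=resources_pb2.Data(
                    image=resources_pb2.Image(base64=f.read())
                )
            )
        ],
    )

response = stub.PostModelOutputs(request, metadata=metadata)
if response.status.code == status_code_pb2.SUCCESS:
    for concept in response.outputs[0].data.concepts:
        print(f"{concept.name} {concept.value * 100:.1f}")
```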

Imagga
created on 2022-02-12

kin 63.2
people 34
man 23.6
male 22.8
men 22.3
group 21.8
person 20.5
silhouette 19.9
couple 17.4
adult 17.4
team 16.1
women 15.8
sport 14.2
friends 14.1
businessman 13.2
bride 13
business 12.8
happy 12.5
family 12.5
portrait 12.3
brass 12
black 12
happiness 11.8
crowd 11.5
friendship 11.2
life 11.2
teamwork 11.1
dress 10.8
child 10.5
fun 10.5
mother 10.3
groom 10.3
love 10.3
youth 10.2
world 10.1
wedding 10.1
wind instrument 10.1
girls 10
suit 9.9
active 9.9
fashion 9.8
together 9.6
party 9.5
smiling 9.4
nurse 9.3
two 9.3
exercise 9.1
human 9
art 8.9
celebration 8.8
silhouettes 8.7
boy 8.7
professional 8.3
leisure 8.3
room 8.1
lifestyle 7.9
urban 7.9
standing 7.8
run 7.7
motion 7.7
outdoor 7.6
walking 7.6
joy 7.5
light 7.3
musical instrument 7.3
decoration 7.3
cheerful 7.3
body 7.2
home 7.2
summer 7.1
work 7.1
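
The Imagga tags above follow the shape of Imagga's v2 auto-tagging endpoint. Below is a hedged sketch of that REST call; the API key, secret, and filename are placeholders.

```python
import requests

# Hedged sketch: Imagga v2 auto-tagging, which returns tag/confidence
# pairs like the Imagga list above. Credentials and filename are placeholders.
API_KEY = "your_imagga_api_key"
API_SECRET = "your_imagga_api_secret"

with open("family-portrait.jpg", "rb") as f:
    response = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=(API_KEY, API_SECRET),
        files={"image": f},
    )

for item in response.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```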

Google
created on 2022-02-12

Microsoft
created on 2022-02-12

text 97.2
person 86
clothing 85.4
posing 60.6
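
The Microsoft tags above match the name/confidence output of Azure Computer Vision image tagging. A hedged SDK sketch follows; the endpoint, key, and filename are placeholders, not details taken from this record.

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Hedged sketch: Azure Computer Vision tagging returning name/confidence
# pairs like the Microsoft list above. Endpoint, key, and filename are placeholders.
client = ComputerVisionClient(
    "https://YOUR-RESOURCE.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_AZURE_KEY"),
)

with open("family-portrait.jpg", "rb") as f:
    analysis = client.tag_image_in_stream(f)

for tag in analysis.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```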

Color Analysis

Face analysis

AWS Rekognition

Age 34-42
Gender Male, 96.6%
Happy 46.9%
Calm 32.3%
Sad 10.6%
Surprised 6.1%
Angry 1.6%
Confused 1.2%
Disgusted 0.8%
Fear 0.5%

AWS Rekognition

Age 39-47
Gender Male, 81.8%
Calm 74.7%
Surprised 11.7%
Sad 7.4%
Happy 3.6%
Confused 1%
Disgusted 0.6%
Angry 0.5%
Fear 0.4%

AWS Rekognition

Age 37-45
Gender Male, 99.7%
Calm 70.5%
Fear 7.2%
Surprised 7.2%
Angry 5.6%
Sad 4%
Disgusted 2.7%
Confused 1.8%
Happy 1%

AWS Rekognition

Age 35-43
Gender Male, 99.9%
Sad 63%
Calm 21.4%
Surprised 4.3%
Confused 3.6%
Disgusted 2.4%
Happy 2.3%
Angry 2.3%
Fear 0.7%

AWS Rekognition

Age 45-53
Gender Male, 100%
Happy 54.4%
Confused 18.6%
Calm 13.2%
Sad 3.8%
Fear 3.5%
Surprised 2.8%
Angry 2.6%
Disgusted 1%
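
The five face records above (an age range, a gender estimate, and an emotion distribution per face) correspond to Amazon Rekognition face detection with full attributes. A minimal boto3 sketch follows; the filename is a placeholder.

```python
import boto3

# Hedged sketch: Amazon Rekognition face detection with full attributes,
# producing the age-range / gender / emotion breakdowns listed above.
client = boto3.client("rekognition")

with open("family-portrait.jpg", "rb") as f:  # placeholder filename
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```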

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
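
The Google Vision entries above report per-face likelihood buckets (for example, Very unlikely) for surprise, anger, sorrow, joy, headwear, and blur. A minimal google-cloud-vision sketch follows; the filename is a placeholder.

```python
from google.cloud import vision

# Hedged sketch: Google Cloud Vision face detection, which reports
# likelihood buckets such as VERY_UNLIKELY for the attributes listed above.
client = vision.ImageAnnotatorClient()

with open("family-portrait.jpg", "rb") as f:  # placeholder filename
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```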

Feature analysis

Amazon

Person 99.6%
Person 99.3%
Person 98.7%
Person 98.6%
Person 92.9%
Shoe 93%
Tie 87.7%
Chair 86.9%

Categories

Imagga

paintings art 87.8%
people portraits 11.8%
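
The two Imagga categories above (paintings art, people portraits) look like output from Imagga's v2 categorization endpoint. The sketch below is an assumption on my part: the "personal_photos" categorizer ID, credentials, and filename are placeholders not confirmed by this record.

```python
import requests

# Hedged sketch: Imagga v2 categorization. The "personal_photos"
# categorizer ID, credentials, and filename are placeholder assumptions.
API_KEY = "your_imagga_api_key"
API_SECRET = "your_imagga_api_secret"

with open("family-portrait.jpg", "rb") as f:
    response = requests.post(
        "https://api.imagga.com/v2/categories/personal_photos",
        auth=(API_KEY, API_SECRET),
        files={"image": f},
    )

for category in response.json()["result"]["categories"]:
    print(f"{category['name']['en']} {category['confidence']:.1f}%")
```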

Text analysis

Amazon

was
M 113 was
M 113
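
The detections above ("was", "M 113 was", "M 113") are the kind of output Amazon Rekognition text detection returns for writing visible in the print. A minimal boto3 sketch follows; the filename is a placeholder.

```python
import boto3

# Hedged sketch: Amazon Rekognition text detection, which returns the
# lines and words it finds in the image along with confidence scores.
client = boto3.client("rekognition")

with open("family-portrait.jpg", "rb") as f:  # placeholder filename
    response = client.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    print(f"{detection['Type']}: {detection['DetectedText']} "
          f"({detection['Confidence']:.1f}%)")
```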