Human Generated Data

Title

Untitled (family of four seated on couch)

Date

c. 1950

People

Artist: Lainson Studios

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.21864

Machine Generated Data

Tags (confidence scores, %)

Amazon
created on 2023-10-25

Lamp 100
Table Lamp 100
Architecture 100
Building 100
Furniture 100
Indoors 100
Living Room 100
Room 100
Couch 100
Adult 99.2
Male 99.2
Man 99.2
Person 99.2
Person 98.9
Person 98.2
Animal 98
Canine 98
Mammal 98
Pet 98
Puppy 98
Male 97.6
Person 97.6
Boy 97.6
Child 97.6
Dog 97
Face 96.5
Head 96.5
Photography 96.5
Portrait 96.5
Clothing 91.6
Formal Wear 91.6
Suit 91.6
Person 90.5
Suit 87.2
Lampshade 82.2
People 81.1
Coat 76.9
Footwear 74.8
Shoe 74.8
Cat 69.8
Suit 63
Shoe 61.5
Chair 57.4
Interior Design 56.2
Terrier 55.9
Table 55.3
Lighting 55.1
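
The labels above match the output shape of Amazon Rekognition's DetectLabels API: a label name plus a confidence score on a 0-100 scale. A minimal sketch of how comparable tags could be regenerated with boto3, assuming configured AWS credentials and a local copy of the image saved under the placeholder name photo.jpg:

    import boto3

    # Assumes AWS credentials are configured and the image is saved locally.
    rekognition = boto3.client("rekognition")
    with open("photo.jpg", "rb") as f:  # placeholder filename
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=55,  # roughly the cutoff seen in the list above
    )
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')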

Clarifai
created on 2018-08-23

people 99.1
group 97.4
woman 95.3
sit 94.1
group together 93.6
adult 93.3
leader 93
man 92.9
administration 91.6
chair 90.6
family 87.7
actress 84.5
room 83.9
facial expression 83.9
child 83.4
furniture 83.1
portrait 78.8
many 72.2
indoors 70.6
ceremony 69

Imagga
created on 2018-08-23

kin 57.1
man 41.7
people 35.1
male 30.6
couple 28.7
person 27.4
adult 24.7
home 23.1
child 22
happy 21.9
business 20
love 19.7
room 19.4
sitting 18.9
smiling 18.8
together 18.4
men 18
family 17.8
businessman 17.6
professional 15.5
indoors 14.9
laptop 14.7
indoor 14.6
couch 14.5
lifestyle 14.4
women 14.2
boy 13.9
cheerful 13.8
mother 13.4
computer 12.9
group 12.9
office 12.8
happiness 12.5
house 12.5
friends 12.2
senior 12.2
executive 11.9
casual 11.9
leisure 11.6
grandfather 11.5
holding 10.7
handsome 10.7
kid 10.6
interior 10.6
meeting 10.4
teacher 10.3
work 10.2
relax 10.1
suit 9.9
job 9.7
portrait 9.7
looking 9.6
loving 9.5
marriage 9.5
smile 9.3
student 9.2
worker 8.9
husband 8.7
elderly 8.6
sofa 8.6
reading 8.6
face 8.5
two 8.5
relationship 8.4
friendship 8.4
attractive 8.4
life 8.4
father 8.2
team 8.1
night 8
to 8
working 7.9
chair 7.8
education 7.8
corporate 7.7
grandma 7.7
comfort 7.7
expression 7.7
married 7.7
youth 7.7
old 7.7
employee 7.6
tie 7.6
living 7.6
parent 7.6
communication 7.6
fun 7.5
technology 7.4
teamwork 7.4
wine 7.4
lady 7.3
holiday 7.2
romance 7.1

Google
created on 2018-08-23

Microsoft
created on 2018-08-23

person 99.8
wall 99.2
indoor 97.1
posing 84.8
group 80.4
people 69.6

Color Analysis

Face analysis

AWS Rekognition

Age 18-26
Gender Female, 94%
Calm 98.9%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Angry 0.4%
Confused 0.2%
Happy 0.1%
Disgusted 0%

AWS Rekognition

Age 35-43
Gender Male, 100%
Calm 99.2%
Surprised 6.4%
Fear 5.9%
Sad 2.2%
Happy 0.1%
Confused 0.1%
Angry 0.1%
Disgusted 0%

AWS Rekognition

Age 9-17
Gender Male, 100%
Calm 98.6%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Happy 0.4%
Confused 0.3%
Angry 0.3%
Disgusted 0.1%

AWS Rekognition

Age 41-49
Gender Female, 100%
Happy 97.4%
Surprised 6.7%
Fear 6%
Sad 2.2%
Calm 0.6%
Confused 0.3%
Angry 0.3%
Disgusted 0.1%
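
The per-face age, gender, and emotion estimates above follow the shape of Amazon Rekognition's DetectFaces response. A hedged sketch of producing similar output, under the same assumptions as the snippet above (boto3, placeholder photo.jpg):

    import boto3

    rekognition = boto3.client("rekognition")
    with open("photo.jpg", "rb") as f:  # placeholder filename
        image_bytes = f.read()

    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # request age range, gender, and emotions
    )
    for face in response["FaceDetails"]:
        age = face["AgeRange"]       # {"Low": ..., "High": ...}
        gender = face["Gender"]      # {"Value": ..., "Confidence": ...}
        emotions = sorted(face["Emotions"],
                          key=lambda e: e["Confidence"], reverse=True)
        print(f'Age {age["Low"]}-{age["High"]}, '
              f'Gender {gender["Value"]}, {gender["Confidence"]:.0f}%')
        for emotion in emotions:
            print(f'  {emotion["Type"].title()} {emotion["Confidence"]:.1f}%')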

Microsoft Cognitive Services

Age 35
Gender Female

Microsoft Cognitive Services

Age 45
Gender Male

Microsoft Cognitive Services

Age 29
Gender Female

Microsoft Cognitive Services

Age 9
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Adult 99.2%
Male 99.2%
Man 99.2%
Person 99.2%
Boy 97.6%
Child 97.6%
Dog 97%
Suit 91.6%
Coat 76.9%
Shoe 74.8%
Cat 69.8%

Text analysis

Amazon

JOON
NOSNIT
ABAN30
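
The strings above follow Amazon Rekognition's DetectText output (the DetectedText field of each detection). A minimal sketch of extracting them, under the same assumptions as the earlier snippets:

    import boto3

    rekognition = boto3.client("rekognition")
    with open("photo.jpg", "rb") as f:  # placeholder filename
        image_bytes = f.read()

    response = rekognition.detect_text(Image={"Bytes": image_bytes})
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":  # skip word-level duplicates
            print(detection["DetectedText"])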