Human Generated Data

Title

Untitled (family portrait in living room)

Date

c. 1950

People

Artist: Lainson Studios

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.21884

Machine Generated Data

Tags

Amazon
created on 2023-10-25

Lamp 100
Table Lamp 99.9
Architecture 99.9
Building 99.9
Furniture 99.9
Indoors 99.9
Living Room 99.9
Room 99.9
Person 99.2
Person 99.1
Adult 99.1
Male 99.1
Man 99.1
Person 99
Adult 99
Male 99
Man 99
Person 98.4
Adult 98.4
Male 98.4
Man 98.4
Couch 97.2
People 97.1
Animal 95.9
Canine 95.9
Dog 95.9
Mammal 95.9
Pet 95.9
Formal Wear 94.2
Hound 92.3
Clothing 88.6
Suit 88.6
Face 87.9
Head 87.9
Interior Design 84.9
Suit 77
Fireplace 67.1
Chair 62.7
Photography 60.5
Portrait 60.5
Coat 60.2
Chandelier 57.7
Dining Room 57.3
Dining Table 57.3
Table 57.3
Lampshade 57.3
Dress 57
Puppy 56.7
Beagle 55.9
Bulldog 55.7
Armchair 55.4
Cabinet 55.3
Tuxedo 55.1

Clarifai
created on 2018-08-23

people 99.9
group 99
adult 96.9
woman 96.4
room 96.2
administration 96
leader 95.1
man 94.7
actress 93.7
indoors 92.7
sit 91.7
family 91.2
furniture 91.2
outfit 90.6
group together 90.5
chair 89.2
home 89
canine 89
many 88.9
child 85.8

Imagga
created on 2018-08-23

man 36.9
male 32.8
people 28.4
person 27
businessman 21.2
business 20
adult 18.4
teacher 17.4
happy 16.9
uniform 16
military uniform 15.8
couple 15.7
portrait 15.5
room 15
blackboard 14.7
office 12.8
work 12.5
holding 12.4
lady 12.2
sexy 12
indoor 11.9
classroom 11.5
boss 11.5
world 11.4
group 11.3
clothing 11
businesswoman 10.9
team 10.7
handsome 10.7
family 10.7
professional 10.7
interior 10.6
fashion 10.5
looking 10.4
education 10.4
men 10.3
attractive 9.8
job 9.7
indoors 9.7
success 9.6
together 9.6
crowd 9.6
child 9.5
meeting 9.4
lifestyle 9.4
silhouette 9.1
old 9
fun 9
student 8.8
educator 8.7
boy 8.7
smiling 8.7
class 8.7
covering 8.6
expression 8.5
casual 8.5
senior 8.4
presentation 8.4
lights 8.3
occupation 8.2
human 8.2
style 8.2
board 8.1
dress 8.1
symbol 8.1
love 7.9
smile 7.8
hands 7.8
black 7.8
color 7.8
sitting 7.7
leader 7.7
school 7.5
design 7.3
home 7.2
icon 7.1
women 7.1
consumer goods 7
happiness 7

Google
created on 2018-08-23

Microsoft
created on 2018-08-23

wall 98.4
floor 93.9
person 92.5
indoor 90.6
posing 71.2

Face analysis

AWS Rekognition

Age 36-44
Gender Female, 100%
Calm 49.3%
Surprised 33.5%
Happy 13.4%
Fear 6.7%
Confused 6.1%
Angry 3.7%
Sad 2.5%
Disgusted 0.7%

AWS Rekognition

Age 10-18
Gender Male, 99.9%
Calm 47.4%
Happy 43.4%
Surprised 6.8%
Fear 5.9%
Confused 3.5%
Sad 3.2%
Angry 1.4%
Disgusted 0.3%

AWS Rekognition

Age 12-20
Gender Female, 95.7%
Calm 99%
Surprised 6.3%
Fear 5.9%
Sad 2.3%
Angry 0.2%
Confused 0.2%
Disgusted 0%
Happy 0%

AWS Rekognition

Age 35-43
Gender Male, 99.5%
Calm 77.2%
Happy 21.6%
Surprised 6.4%
Fear 5.9%
Sad 2.2%
Angry 0.3%
Confused 0.2%
Disgusted 0.1%

Microsoft Cognitive Services

Age 51
Gender Male

Microsoft Cognitive Services

Age 48
Gender Female

Microsoft Cognitive Services

Age 15
Gender Female

Microsoft Cognitive Services

Age 15
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.2%
Adult 99.1%
Male 99.1%
Man 99.1%
Dog 95.9%
Suit 88.6%

Text analysis

Amazon

DENVER
PROOF
LAINSON
BETURNED
STARS