Human Generated Data

Title

Untitled (family portrait)

Date

c. 1950

People

Artist: Lainson Studios

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.21889

Machine Generated Data

Tags

Amazon
created on 2023-10-25

Lamp 100
Table Lamp 99.9
Architecture 99.8
Building 99.8
Furniture 99.8
Indoors 99.8
Living Room 99.8
Room 99.8
People 99.7
Person 99.1
Adult 99.1
Adult 99.1
Bride 99.1
Female 99.1
Female 99.1
Wedding 99.1
Woman 99.1
Person 99.1
Person 98.9
Adult 98.9
Female 98.9
Woman 98.9
Person 98.3
Adult 98.3
Male 98.3
Man 98.3
Face 97
Head 97
Photography 97
Portrait 97
Animal 96.4
Canine 96.4
Dog 96.4
Mammal 96.4
Pet 96.4
Couch 96.4
Clothing 92.6
Formal Wear 92.6
Suit 92.6
Suit 90.1
Chair 78.7
Dining Room 73.6
Dining Table 73.6
Table 73.6
Coat 67.5
Footwear 67.3
Shoe 67.3
Shoe 64.3
Fireplace 62.1
Cabinet 57.9
Dress 57.7
Chandelier 57.4
Hound 57.2
Interior Design 56.8
Lady 56.6
Jacket 56.5
Shoe 56.5
Lampshade 56.5
Puppy 56.1
Accessories 56
Bulldog 55.3
Armchair 55.3
Tuxedo 55.2
Skirt 55.2

Clarifai
created on 2018-08-23

people 99.9
group 99.1
administration 98.8
adult 97.3
leader 97.2
room 95.5
woman 94.9
group together 94.2
man 93.7
chair 91.9
home 91.6
furniture 90.5
actress 90.4
many 90.1
several 88
indoors 86.7
five 86.6
family 84.2
sit 83.2
outfit 83.1

Imagga
created on 2018-08-23

man 26.8
people 25.1
person 24.6
blackboard 23.3
male 22.9
happy 18.2
uniform 17.7
child 17.6
adult 17.3
world 16.5
military uniform 15.6
portrait 15.5
business 15.2
businessman 15
teacher 15
clothing 14.8
room 13.4
classroom 13.3
lifestyle 12.3
home 11.9
attractive 11.9
indoor 11.9
indoors 11.4
lady 11.3
fashion 11.3
boy 11.3
casual 11
smiling 10.8
smile 10.7
women 10.3
girls 10
professional 9.9
family 9.8
interior 9.7
group 9.7
style 9.6
couple 9.6
work 9.4
happiness 9.4
expression 9.4
covering 9.2
businesswoman 9.1
dress 9
fun 9
handsome 8.9
office 8.8
sexy 8.8
looking 8.8
together 8.8
kin 8.7
education 8.6
sitting 8.6
face 8.5
pretty 8.4
color 8.3
holding 8.2
board 8.1
kid 8
school 7.8
black 7.8
dance 7.7
men 7.7
couch 7.7
class 7.7
old 7.7
reading 7.6
educator 7.6
communication 7.5
student 7.5
house 7.5
teen 7.3
cheerful 7.3
success 7.2
chair 7.2
holiday 7.2
consumer goods 7.2
trainer 7.1

Google
created on 2018-08-23

Microsoft
created on 2018-08-23

wall 98.4
floor 93.1
indoor 92.5
person 88.8
posing 61.9
old 48.9

Face analysis

AWS Rekognition

Age 38-46
Gender Female, 100%
Happy 43.8%
Surprised 25.7%
Calm 23.8%
Fear 7.4%
Confused 4.7%
Angry 3.3%
Sad 2.4%
Disgusted 0.7%

AWS Rekognition

Age 9-17
Gender Male, 100%
Happy 65.2%
Calm 21.7%
Confused 10.3%
Surprised 6.9%
Fear 5.9%
Sad 2.3%
Angry 1%
Disgusted 0.2%

AWS Rekognition

Age 16-22
Gender Female, 96.6%
Calm 99%
Surprised 6.3%
Fear 5.9%
Sad 2.3%
Angry 0.2%
Confused 0.1%
Happy 0%
Disgusted 0%

AWS Rekognition

Age 36-44
Gender Male, 99.8%
Happy 98.3%
Surprised 6.4%
Fear 5.9%
Sad 2.2%
Calm 1.2%
Angry 0.1%
Confused 0.1%
Disgusted 0.1%

Microsoft Cognitive Services

Age 50
Gender Male

Microsoft Cognitive Services

Age 45
Gender Female

Microsoft Cognitive Services

Age 19
Gender Female

Microsoft Cognitive Services

Age 15
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.1%
Adult 99.1%
Bride 99.1%
Female 99.1%
Woman 99.1%
Male 98.3%
Man 98.3%
Dog 96.4%
Suit 92.6%
Shoe 67.3%

Text analysis

Amazon

DENVER
PROOF
ENVAR
LAINSON
MAINSON
NOSS
NOSS MUTT
ورة
MUTT