Human Generated Data

Title

Untitled (family portrait in living room)

Date

c. 1950

People

Artist: Lainson Studios

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.21874

Machine Generated Data

Tags (label followed by confidence score out of 100)

Amazon
created on 2022-03-11

Person 99.8
Human 99.8
Clothing 99.6
Apparel 99.6
Chair 99.2
Furniture 99.2
Person 98.9
Person 98.8
Person 98.2
Dog 90.1
Mammal 90.1
Animal 90.1
Canine 90.1
Pet 90.1
Interior Design 87.8
Indoors 87.8
Dress 87.4
Female 86.7
Tie 86.2
Accessories 86.2
Accessory 86.2
Sleeve 80
Coat 79.5
Clinic 77.8
Room 76
Woman 72.1
Pillow 72
Cushion 72
Suit 70.9
Overcoat 70.9
People 66.8
Girl 65.2
Portrait 64.9
Photography 64.9
Face 64.9
Photo 64.9
Floor 64.7
Lab Coat 64.5
Flooring 59.2
Shirt 59
Curtain 58.4
Shoe 58
Footwear 58
Kid 57.5
Child 57.5
Long Sleeve 57
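
Label/confidence pairs like those above can be produced with Amazon Rekognition's DetectLabels operation. A minimal Python sketch, assuming boto3 is installed and AWS credentials are configured; the local file name is hypothetical:

import boto3

# Assumes configured AWS credentials; the image file name is a placeholder.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("family_portrait.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,
        MinConfidence=55,
    )

# Print each label with its confidence score (0-100).
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")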

Clarifai
created on 2023-10-22

people 99.8
wedding 99.3
bride 99
veil 97.2
two 97.1
woman 96.8
group 96.7
groom 96
dress 95.9
adult 94.8
bridal 90.7
man 90.2
wear 89.8
bridesmaid 86.7
indoors 86.1
marriage 85.8
three 85.1
gown 84.5
one 83.6
girl 83.5
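
A comparable tag list can be requested from Clarifai's v2 predict endpoint. A minimal sketch, assuming a Clarifai personal access token; the model ID and image URL below are placeholders, not values taken from this record:

import requests

# Sketch only: PAT, model ID, and image URL are assumptions.
PAT = "YOUR_CLARIFAI_PAT"
MODEL_ID = "general-image-recognition"

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {PAT}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/family_portrait.jpg"}}}]},
)

# Clarifai returns concept scores between 0 and 1; scale to percent for display.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")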

Imagga
created on 2022-03-11

people 38.5
adult 31.9
person 28.4
happy 26.9
smiling 26.8
man 26.2
dress 26.2
male 24.9
happiness 24.3
women 23.7
couple 23.5
fashion 21.1
boutique 20.1
portrait 19.4
lifestyle 18.8
groom 18.1
bride 17.9
casual 16.9
looking 16.8
love 16.6
wedding 16.6
smile 16.4
men 16.3
interior 15.9
clothing 15.8
indoors 15.8
standing 15.6
professional 15.6
cheerful 15.4
pretty 15.4
attractive 15.4
two 15.2
home 15.2
family 15.1
together 14.9
holding 14.9
lady 14.6
domestic 14.1
indoor 13.7
work 13.5
business 13.4
shopping 13.3
walking 13.3
room 13.3
shopper 13.2
style 12.6
day 12.6
modern 11.9
elegance 11.8
bags 11.7
full length 11.6
marriage 11.4
wife 11.4
group 11.3
corporate 11.2
20s 11
bag 10.9
hand 10.6
suit 10.3
holiday 10
joy 10
face 9.9
outfit 9.8
consumerism 9.8
husband 9.7
human 9.7
office 9.7
businessman 9.7
adults 9.5
clothes 9.4
camera 9.2
house 9.2
children 9.1
old 9.1
fun 9
one 9
daughter 8.9
hospital 8.9
mall 8.8
carrying 8.7
married 8.6
customer 8.6
elegant 8.6
females 8.5
store 8.5
togetherness 8.5
sale 8.3
life 8.3
shop 8.2
businesswoman 8.2
new 8.1
activity 8.1
bouquet 8
color 7.8
mother 7.8
health 7.6
child 7.6
worker 7.6
friends 7.5
teamwork 7.4
girls 7.3
father 7.2
success 7.2
stylish 7.2
cute 7.2
team 7.2
romantic 7.1
posing 7.1
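
Imagga exposes a similar tagging endpoint. A minimal sketch, assuming an Imagga API key and secret; the image URL is a placeholder:

import requests

# Sketch only: credentials and image URL are assumptions.
API_KEY = "YOUR_IMAGGA_KEY"
API_SECRET = "YOUR_IMAGGA_SECRET"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/family_portrait.jpg"},
    auth=(API_KEY, API_SECRET),  # HTTP Basic auth with key/secret
)

# Each entry carries a confidence (0-100) and the tag text keyed by language.
for tag in resp.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")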

Google
created on 2022-03-11

Microsoft
created on 2022-03-11

text 98.4
dress 93.7
wedding 85.4
wedding dress 81.6
bride 72.7
woman 64
clothing 61.6
clothes 16.3
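
Tags of this kind can be obtained from the Azure Computer Vision analyze endpoint. A minimal sketch, assuming an Azure Computer Vision resource; the endpoint, key, and image URL are placeholders:

import requests

# Sketch only: endpoint, key, and image URL are assumptions.
ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
KEY = "YOUR_AZURE_KEY"

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY, "Content-Type": "application/json"},
    json={"url": "https://example.org/family_portrait.jpg"},
)

# Azure reports confidences between 0 and 1; scale to percent for display.
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")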

Color Analysis

Face analysis

AWS Rekognition

Age 30-40
Gender Male, 59.4%
Happy 92.2%
Surprised 3.4%
Calm 3.3%
Disgusted 0.4%
Fear 0.3%
Sad 0.2%
Confused 0.1%
Angry 0.1%

AWS Rekognition

Age 49-57
Gender Male, 99.7%
Happy 78.9%
Surprised 7.3%
Sad 5.5%
Calm 2.6%
Disgusted 2.1%
Confused 1.6%
Angry 1.1%
Fear 0.9%

AWS Rekognition

Age 29-39
Gender Female, 66.4%
Calm 68.4%
Happy 23.9%
Sad 2.1%
Confused 2.1%
Disgusted 1.1%
Surprised 1.1%
Fear 0.7%
Angry 0.6%

AWS Rekognition

Age 38-46
Gender Male, 99.4%
Calm 97.4%
Sad 0.8%
Happy 0.7%
Confused 0.5%
Surprised 0.2%
Disgusted 0.2%
Angry 0.1%
Fear 0.1%
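
Per-face age, gender, and emotion estimates like the blocks above are what Amazon Rekognition's DetectFaces operation returns when all attributes are requested. A minimal sketch, assuming configured AWS credentials; the file name is hypothetical:

import boto3

# Sketch only: credentials and file name are assumptions.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("family_portrait.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions arrive unsorted; sort by confidence to match the listings above.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")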

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
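
The likelihood ratings above correspond to the enum values returned by Google Cloud Vision face detection (VERY_UNLIKELY through VERY_LIKELY). A minimal sketch, assuming Google Cloud credentials and google-cloud-vision 2.x or later; the file name is hypothetical:

from google.cloud import vision

# Sketch only: credentials and file name are assumptions.
client = vision.ImageAnnotatorClient()

with open("family_portrait.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each field is a Likelihood enum whose name mirrors the ratings above.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)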

Feature analysis

Amazon

Person 99.8%
Person 98.9%
Person 98.8%
Person 98.2%
Dog 90.1%
Tie 86.2%
Shoe 58%
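
These per-object scores are consistent with the instance-level detections that Rekognition's DetectLabels exposes through its Instances field. A minimal sketch under the same assumptions as the earlier Rekognition example (hypothetical file name):

import boto3

# Sketch only: credentials and file name are assumptions.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("family_portrait.jpg", "rb") as f:
    response = rekognition.detect_labels(Image={"Bytes": f.read()}, MinConfidence=55)

for label in response["Labels"]:
    # Only labels with localized instances (e.g. Person, Dog) have bounding boxes.
    for instance in label.get("Instances", []):
        box = instance["BoundingBox"]  # relative Left/Top/Width/Height
        print(f"{label['Name']} {instance['Confidence']:.1f}% at {box}")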

Categories

Imagga

interior objects 99.9%

Text analysis

Amazon

SE
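
Text fragments like the one above can be extracted with Amazon Rekognition's DetectText operation. A minimal sketch, assuming configured AWS credentials; the file name is hypothetical:

import boto3

# Sketch only: credentials and file name are assumptions.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("family_portrait.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# LINE entries give whole detected lines; WORD entries give individual words.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])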