Human Generated Data

Title

Untitled (proof of three children with dog posing in front of fireplace, Christmas picture)

Date

1951

People

Artist: Francis J. Sullivan, American, 1916 - 1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18253

Machine Generated Data

Tags

Amazon
created on 2022-02-25

Furniture 99.9
Person 99.6
Human 99.6
Person 99.4
Person 98.4
Dog 97.9
Mammal 97.9
Animal 97.9
Canine 97.9
Pet 97.9
Chair 91
Couch 80.7
People 74.8
Clothing 72.9
Apparel 72.9
Shoe 72.2
Footwear 72.2
Room 66.1
Indoors 66.1
Shorts 56.5
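
The tag list above is attributed to Amazon; below is a minimal sketch (not the museum's actual pipeline) of how similar label/confidence pairs could be produced with Amazon Rekognition's detect_labels via boto3. The filename and AWS credentials are placeholders.

    import boto3

    rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

    # Hypothetical local filename for the digitized proof print
    with open("sullivan_proof.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=20,
        MinConfidence=50,
    )

    # Print "Label Confidence" pairs, matching the tag list format above
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")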

Clarifai
created on 2023-10-22

people 99.9
child 99.4
group 99.2
canine 99
furniture 96.7
sit 95.7
four 95.4
son 95.1
family 95
group together 95
adult 95
three 94.6
portrait 94.4
offspring 94.3
room 93.9
chair 93.7
sibling 93.5
woman 93
education 91.6
wear 91.1
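
The concept tags above are attributed to Clarifai, presumably its general concept model. Below is a rough sketch of one way such tags could be requested over Clarifai's REST API; the model ID, API key, auth scheme, and filename are placeholders and should be checked against current Clarifai documentation.

    import base64
    import requests

    API_KEY = "YOUR_CLARIFAI_API_KEY"       # placeholder credential
    MODEL_ID = "general-image-recognition"  # assumed model ID; verify in Clarifai docs

    # Hypothetical local filename for the digitized proof print
    with open("sullivan_proof.jpg", "rb") as f:
        image_b64 = base64.b64encode(f.read()).decode()

    resp = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {API_KEY}"},
        json={"inputs": [{"data": {"image": {"base64": image_b64}}}]},
    )

    # Print "concept value" pairs, scaled to percentages like the list above
    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        print(f"{concept['name']} {concept['value'] * 100:.1f}")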

Imagga
created on 2022-02-25

afghan hound 61.6
kin 57.5
hound 53.2
hunting dog 48
dog 46.5
people 25.6
happy 23.2
family 21.3
canine 19.7
senior 19.7
man 18.8
portrait 18.8
male 18.4
old 17.4
smiling 17.3
elderly 17.2
outdoors 17.2
person 17
home 16.7
together 15.8
adult 15.6
child 15.5
face 14.9
sitting 14.6
lifestyle 14.4
smile 14.2
domestic animal 14.2
mother 14.2
wheelchair 13.7
cute 13.6
chair 13.4
happiness 13.3
setter 13
fun 12.7
sporting dog 12.1
mature 12.1
pet 11.9
playing 11.8
head 11.7
retired 11.6
lady 11.4
couple 11.3
pensioner 10.8
couch 10.6
cheerful 10.6
children 10
aged 9.9
parent 9.7
retirement 9.6
looking 9.6
love 9.5
togetherness 9.4
friends 9.4
park 9.1
dress 9
father 8.9
little 8.8
play 8.6
daughter 8.6
men 8.6
outside 8.5
expression 8.5
two 8.5
room 8.4
care 8.2
childhood 8.1
grandmother 7.8
attractive 7.7
health 7.6
husband 7.6
illness 7.6
casual 7.6
friend 7.6
friendship 7.5
leisure 7.5
animals 7.4
adorable 7.4
indoor 7.3
trainer 7.3
girls 7.3
hair 7.1
women 7.1
grass 7.1
day 7.1
look 7
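
The tags above are attributed to Imagga. Below is a rough sketch of how such tags could be requested from Imagga's /v2/tags REST endpoint; the credentials and filename are placeholders, and the exact request and response shapes should be checked against Imagga's current API documentation.

    import requests

    IMAGGA_KEY = "YOUR_IMAGGA_API_KEY"        # placeholder credentials
    IMAGGA_SECRET = "YOUR_IMAGGA_API_SECRET"

    # Hypothetical local filename for the digitized proof print
    with open("sullivan_proof.jpg", "rb") as f:
        resp = requests.post(
            "https://api.imagga.com/v2/tags",
            auth=(IMAGGA_KEY, IMAGGA_SECRET),
            files={"image": f},
        )

    # Print "tag confidence" pairs, matching the tag list format above
    for item in resp.json()["result"]["tags"]:
        print(f"{item['tag']['en']} {item['confidence']:.1f}")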

Google
created on 2022-02-25

Microsoft
created on 2022-02-25

wall 98.7
clothing 97.4
person 95.3
sitting 94.5
indoor 91
human face 89.7
smile 88.5
text 86.6
child 85.1
baby 77.2
family 72.1
toddler 66.3
footwear 60
furniture 57.8
chair 52.2
woman 50.7
posing 49.6
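
The tags above are attributed to Microsoft, presumably its Azure Computer Vision tagging service. Below is a minimal sketch using the legacy azure-cognitiveservices-vision-computervision Python SDK; the endpoint, key, and filename are placeholders, not the museum's actual configuration.

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
    KEY = "YOUR_AZURE_CV_KEY"                                          # placeholder

    client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

    # Hypothetical local filename for the digitized proof print
    with open("sullivan_proof.jpg", "rb") as f:
        result = client.tag_image_in_stream(f)

    # Print "tag confidence" pairs, scaled to percentages like the list above
    for tag in result.tags:
        print(f"{tag.name} {tag.confidence * 100:.1f}")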

Color Analysis

Face analysis

AWS Rekognition

Age 4-12
Gender Female, 99.6%
Calm 96.1%
Happy 1.4%
Angry 0.6%
Sad 0.5%
Disgusted 0.4%
Surprised 0.4%
Fear 0.3%
Confused 0.3%

AWS Rekognition

Age 6-16
Gender Male, 100%
Happy 98.2%
Calm 1.2%
Surprised 0.3%
Confused 0.1%
Disgusted 0%
Angry 0%
Sad 0%
Fear 0%

AWS Rekognition

Age 2-8
Gender Male, 98.9%
Calm 98.3%
Sad 0.6%
Happy 0.3%
Disgusted 0.2%
Surprised 0.2%
Confused 0.2%
Angry 0.1%
Fear 0.1%
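
The per-face age ranges, gender estimates, and emotion scores above come from AWS Rekognition. Below is a minimal sketch of how such output could be produced with boto3's detect_faces; the filename and AWS credentials are placeholders.

    import boto3

    rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

    # Hypothetical local filename for the digitized proof print
    with open("sullivan_proof.jpg", "rb") as f:
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # include age range, gender, and emotions
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        # Emotions sorted from most to least confident, as listed above
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")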

Microsoft Cognitive Services

Age 5
Gender Female

Microsoft Cognitive Services

Age 7
Gender Male

Microsoft Cognitive Services

Age 5
Gender Male
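
The age and gender estimates above come from Microsoft Cognitive Services. Below is a rough sketch using the legacy azure-cognitiveservices-vision-face SDK; the endpoint, key, and filename are placeholders, and Microsoft has since retired age and gender estimation from the Face API, so this is illustrative only.

    from azure.cognitiveservices.vision.face import FaceClient
    from msrest.authentication import CognitiveServicesCredentials

    ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
    KEY = "YOUR_AZURE_FACE_KEY"                                        # placeholder

    face_client = FaceClient(ENDPOINT, CognitiveServicesCredentials(KEY))

    # Hypothetical local filename for the digitized proof print
    with open("sullivan_proof.jpg", "rb") as f:
        faces = face_client.face.detect_with_stream(
            f, return_face_attributes=["age", "gender"]
        )

    for face in faces:
        print(f"Age {face.face_attributes.age:.0f}")
        print(f"Gender {face.face_attributes.gender}")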

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely
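
The per-face likelihood ratings above come from Google Vision. Below is a minimal sketch of how such ratings could be produced with the google-cloud-vision Python client (v2+ assumed); the filename and credentials are placeholders.

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()  # assumes application default credentials

    # Hypothetical local filename for the digitized proof print
    with open("sullivan_proof.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # Each attribute is reported as a likelihood enum (VERY_UNLIKELY ... VERY_LIKELY)
    for face in response.face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)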

Feature analysis

Amazon

Person 99.6%
Person 99.4%
Person 98.4%
Dog 97.9%
Chair 91%
Shoe 72.2%
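
The per-object detections above reflect the instance data in a Rekognition detect_labels response, like the tag list earlier. Below is a minimal, self-contained sketch of reading those instances; the filename and AWS credentials are placeholders.

    import boto3

    rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

    # Hypothetical local filename for the digitized proof print
    with open("sullivan_proof.jpg", "rb") as f:
        response = rekognition.detect_labels(Image={"Bytes": f.read()})

    # Labels with "Instances" carry a bounding box per detected object
    for label in response["Labels"]:
        for instance in label.get("Instances", []):
            print(f"{label['Name']} {instance['Confidence']:.1f}%",
                  instance["BoundingBox"])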

Categories

Imagga

people portraits 99.8%

Text analysis

Amazon

SULLIVAR
PRILLIYAN
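
The strings above are text detected in the image by Amazon. Below is a minimal sketch of such an OCR call with boto3's detect_text; the filename and AWS credentials are placeholders.

    import boto3

    rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

    # Hypothetical local filename for the digitized proof print
    with open("sullivan_proof.jpg", "rb") as f:
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    # Report line-level detections (word-level entries have Type "WORD")
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"], f"{detection['Confidence']:.1f}%")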