Human Generated Data

Title

Untitled (family group standing outside house)

Date

1915

People

Artist: Hamblin Studio, American, active 1930s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2153

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Human 99.8
Person 99.8
Person 99.7
Person 99.7
Person 99.7
Person 99.4
Clothing 98.8
Apparel 98.8
Person 98.6
Person 98.3
Person 98.2
People 98.1
Family 97.6
Shorts 91
Female 67.7
Shoe 67.5
Footwear 67.5
Person 64.3
Skirt 63.8
Shoe 50.8
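The Amazon entries above are label/confidence pairs from an image-labeling service, with confidence given as a percentage. As a minimal sketch (plain Python; the labels are a subset transcribed from the list above), pairs like these can be filtered by a confidence threshold:

```python
# Label/confidence pairs as reported above (a hand-transcribed subset, in percent).
tags = [
    ("Human", 99.8), ("Person", 99.8), ("Clothing", 98.8),
    ("Family", 97.6), ("Shorts", 91.0), ("Female", 67.7),
    ("Shoe", 67.5), ("Skirt", 63.8),
]

def confident(tags, threshold=90.0):
    """Keep only labels at or above the given confidence (percent)."""
    return [label for label, score in tags if score >= threshold]

print(confident(tags))  # -> ['Human', 'Person', 'Clothing', 'Family', 'Shorts']
```

With the default 90% threshold, lower-confidence guesses such as "Shoe" (67.5) and "Skirt" (63.8) drop out, which is why they appear near the bottom of the list above.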

Clarifai
created on 2019-06-01

people 99.9
group together 99.8
child 99.5
many 98.9
several 96.8
group 96.5
wear 93.4
adult 92.6
boy 91.6
five 88.1
four 87.1
man 86.5
monochrome 86.4
enjoyment 84.6
woman 81.8
recreation 81.5
education 80.8
sibling 80.3
offspring 80
outfit 78.1

Imagga
created on 2019-06-01

kin 47.9
people 30.7
beach 30.3
man 29.6
runner 29.5
athlete 24.7
person 24.1
male 22
sand 21.8
adult 21.4
child 21.1
outdoors 20.9
lifestyle 20.2
men 19.7
summer 19.3
couple 19.2
happy 18.8
walking 18
sea 18
contestant 17.6
vacation 17.2
active 17.1
outdoor 16.8
family 16
love 15
fun 15
happiness 14.9
sport 14.9
portrait 14.2
together 14
water 14
exercise 13.6
fitness 13.5
running 13.4
ocean 13.3
sibling 13.1
life 12.8
outside 12.8
women 12.6
run 12.5
joy 12.5
leisure 12.5
father 12.3
dress 11.7
silhouette 11.6
smiling 11.6
holiday 11.5
walk 11.4
group 11.3
black 10.8
mother 10.8
dad 10.5
human 10.5
boy 10.4
relationship 10.3
action 10.2
coast 9.9
health 9.7
sky 9.6
motion 9.4
parent 9.2
alone 9.1
fashion 9
nurse 8.9
healthy 8.8
two 8.5
travel 8.4
holding 8.3
children 8.2
sun 8
world 8
business 7.9
jogging 7.9
bright 7.9
son 7.8
play 7.8
sunny 7.7
casual 7.6
clothing 7.5
shore 7.4
fit 7.4
freedom 7.3
recreation 7.2
activity 7.2
smile 7.1
kid 7.1
businessman 7.1
day 7.1

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

clothing 97.7
outdoor 96.4
person 86.8
smile 84.2
boy 81
child 76.7
footwear 70.1
posing 54.1
old 41.8

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Male, 52%
Angry 45.2%
Disgusted 45.4%
Confused 45.2%
Calm 51.6%
Sad 46.4%
Surprised 45.4%
Happy 45.7%

AWS Rekognition

Age 26-43
Gender Female, 52.8%
Happy 46.8%
Confused 45.3%
Angry 45.4%
Disgusted 45.8%
Surprised 45.4%
Sad 47.1%
Calm 49.2%

AWS Rekognition

Age 35-52
Gender Male, 51.5%
Disgusted 45.8%
Sad 49.2%
Happy 46%
Confused 45.5%
Surprised 45.6%
Angry 45.5%
Calm 47.3%

AWS Rekognition

Age 35-52
Gender Female, 50.6%
Happy 46%
Disgusted 45.1%
Sad 45.5%
Surprised 45.2%
Angry 45.1%
Calm 52.9%
Confused 45.2%

AWS Rekognition

Age 23-38
Gender Female, 51.1%
Happy 45.8%
Surprised 45.6%
Angry 45.5%
Confused 45.3%
Calm 50.6%
Sad 46.5%
Disgusted 45.8%

AWS Rekognition

Age 26-43
Gender Male, 53.7%
Disgusted 45.5%
Calm 47.3%
Sad 50.8%
Confused 45.4%
Angry 45.3%
Surprised 45.3%
Happy 45.4%

AWS Rekognition

Age 35-52
Gender Male, 54.4%
Angry 45.3%
Happy 45.8%
Sad 45.3%
Disgusted 45.2%
Confused 45.3%
Calm 52.4%
Surprised 45.7%
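Each AWS Rekognition face record above lists one confidence score per emotion, and the face's reported emotion is simply the highest-scoring entry. A short sketch (plain Python; the scores are transcribed from the first face record above) of picking that dominant emotion:

```python
# Emotion scores for the first detected face, as reported above (percent).
face = {
    "Angry": 45.2, "Disgusted": 45.4, "Confused": 45.2,
    "Calm": 51.6, "Sad": 46.4, "Surprised": 45.4, "Happy": 45.7,
}

def dominant_emotion(scores):
    """Return the (emotion, score) pair with the highest confidence."""
    return max(scores.items(), key=lambda item: item[1])

print(dominant_emotion(face))  # -> ('Calm', 51.6)
```

For this face the scores cluster in a narrow band around 45-52%, so "Calm" wins only by a small margin; the same applies to the other six face records above.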

Feature analysis

Amazon

Person 99.8%
Shoe 67.5%

Categories

Imagga

paintings art 99.9%