Human Generated Data

Title

Untitled (studio portrait of family consisting of five women and four men in various seated and standing poses against painted backdrop)

Date

c. 1910-1920

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.3604

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Human 99.2
Person 99.2
Person 99
Person 98.8
Person 98.6
Person 98.3
Person 96.5
Person 96.4
Person 95.8
Tie 95.8
Accessory 95.8
Accessories 95.8
Clothing 95.4
Apparel 95.4
Person 89.6
Shorts 84.6
Female 72.7
People 71.9
Girl 62
Face 58.8
Coat 58
Nurse 56.1
Shoe 56
Footwear 56

Clarifai
created on 2019-06-01

people 99.7
adult 97.8
group 96.3
man 95.2
woman 93.2
group together 91.4
wear 90.6
several 88.1
many 84
portrait 84
uniform 83.6
four 79.2
outfit 74.6
veil 74.6
five 74.1
child 73.7
retro 73.4
monochrome 72.7
music 71.2
three 70.2

Imagga
created on 2019-06-01

negative 48.7
film 40.3
photographic paper 29.7
people 24.5
person 23.7
man 21.5
photographic equipment 19.8
male 19.1
adult 16.9
human 14.2
silhouette 14.1
men 13.7
art 13.6
portrait 12.9
party 12.9
sport 12.6
world 11.8
business 11.5
group 11.3
active 10.8
crowd 10.6
white 10.4
dark 10
dancer 9.9
mask 9.8
fun 9.7
businessman 9.7
black 9.6
motion 9.4
lifestyle 9.4
grunge 9.4
face 9.2
athlete 9
celebration 8.8
symbol 8.7
women 8.7
dance 8.6
design 8.4
outdoor 8.4
exercise 8.2
team 8.1
happiness 7.8
boy 7.8
color 7.8
summer 7.7
life 7.7
old 7.7
sky 7.6
hand 7.6
head 7.6
sign 7.5
leisure 7.5
event 7.4
freedom 7.3
kin 7

Google
created on 2019-06-01

Photograph 96.3
Snapshot 83.3
Photography 62.4
Family 53.1
Team 52.3

Microsoft
created on 2019-06-01

posing 91.5
text 90.8
person 88.7
window 87.3
clothing 86.1
black and white 69.5
old 56.6

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Male, 53.9%
Sad 45.3%
Surprised 45.3%
Happy 45.9%
Angry 45.2%
Calm 52.9%
Confused 45.1%
Disgusted 45.3%

AWS Rekognition

Age 26-43
Gender Male, 52.4%
Disgusted 45%
Sad 45.2%
Happy 45.1%
Surprised 45%
Calm 54.6%
Angry 45%
Confused 45%

AWS Rekognition

Age 26-43
Gender Female, 50.5%
Surprised 45.1%
Sad 45.7%
Angry 45.2%
Disgusted 45.1%
Calm 53.2%
Happy 45.5%
Confused 45.2%

AWS Rekognition

Age 35-52
Gender Male, 54.7%
Sad 46.8%
Surprised 45.6%
Disgusted 46%
Angry 45.4%
Calm 50.2%
Happy 45.6%
Confused 45.4%

AWS Rekognition

Age 20-38
Gender Male, 51.1%
Sad 45.7%
Surprised 45.2%
Happy 45.3%
Angry 45.2%
Calm 53.5%
Confused 45.1%
Disgusted 45.1%

AWS Rekognition

Age 20-38
Gender Female, 51.9%
Disgusted 45.1%
Confused 45.1%
Angry 45.1%
Calm 53.7%
Surprised 45.1%
Happy 45.1%
Sad 45.8%

AWS Rekognition

Age 20-38
Gender Female, 53.3%
Sad 45.4%
Confused 45.1%
Disgusted 45.1%
Surprised 45.2%
Angry 45.2%
Happy 45.7%
Calm 53.4%

AWS Rekognition

Age 15-25
Gender Male, 54%
Sad 45.5%
Surprised 45.2%
Disgusted 45.1%
Angry 45.1%
Calm 52.9%
Happy 45.9%
Confused 45.1%

AWS Rekognition

Age 26-43
Gender Male, 53.2%
Disgusted 45%
Happy 45.2%
Sad 45.4%
Calm 53.9%
Angry 45.2%
Surprised 45.1%
Confused 45.1%

Feature analysis

Amazon

Person 99.2%
Tie 95.8%
Shoe 56%

Text analysis

Amazon

AAANTYSam
OTOOUUKN
w