Human Generated Data

Title

Untitled (extended family portrait outside house)

Date

c. 1950

People

Artist: Harry Annas, American, 1897 - 1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2457

Machine Generated Data

Tags

Amazon
created on 2019-06-17

Apparel 99.9
Clothing 99.9
Person 99.2
Human 99.2
Person 99.1
Person 99.1
Person 98.8
Skirt 98.7
Person 98.3
Person 98.3
Accessory 98.2
Tie 98.2
Accessories 98.2
Person 97.7
Person 97.6
Female 97.2
Dress 97
Person 96.2
Person 96.2
People 94.6
Shorts 94.5
Helmet 90
Woman 88.6
Chair 88.5
Furniture 88.5
Person 87.1
Coat 68.5
Overcoat 68.5
Suit 68.5
Girl 65.1
Plant 64.1
Grass 64.1
Child 55.2
Kid 55.2
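
The label/confidence pairs above are the kind of output Amazon Rekognition's DetectLabels operation returns. A minimal sketch of how such tags could be regenerated with boto3, assuming the photograph is stored in S3 (the bucket and object names below are placeholders):

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "my-image-bucket", "Name": "annas-family-portrait.jpg"}},
    MaxLabels=50,
    MinConfidence=55.0,  # low threshold so weak labels such as "Kid 55.2" still appear
)

for label in response["Labels"]:
    # Rekognition reports confidence as a percentage, matching the list above
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```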

Clarifai
created on 2019-06-17

people 100
group together 99.8
many 99
adult 98.7
group 98.6
woman 97.5
wear 97.1
several 96.6
child 96.5
outfit 96.2
five 95.9
man 95.7
recreation 93.8
education 92.7
school 92.6
sports equipment 91.7
four 87.7
boy 87.5
athlete 86.1
uniform 85.5
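
Clarifai's general prediction model returns concept/value pairs like the list above. A hedged sketch against the Clarifai v2 REST API using requests; the API key, model ID, and image URL are placeholders:

```python
import requests

CLARIFAI_API_KEY = "YOUR_API_KEY"      # placeholder
GENERAL_MODEL_ID = "general-model-id"  # placeholder for the account's general model

response = requests.post(
    f"https://api.clarifai.com/v2/models/{GENERAL_MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/annas-portrait.jpg"}}}]},
)
response.raise_for_status()

for concept in response.json()["outputs"][0]["data"]["concepts"]:
    # Clarifai reports values in [0, 1]; scale by 100 to match the percentages above
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```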

Imagga
created on 2019-06-17

kin 38.2
brass 31
man 29.6
world 26.2
people 26.2
wind instrument 25.3
sport 21.9
male 21.3
musical instrument 20
silhouette 19.9
adult 17.5
couple 16.5
men 16.3
person 15.8
family 15.1
happy 15
active 14.5
outdoors 13.4
love 13.4
black 13.2
summer 12.9
boy 12.2
fun 12
teacher 11.7
sky 11.5
outdoor 11.5
child 11.1
happiness 11
player 10.8
sunset 10.8
ball 10.7
bride 10.5
group 10.5
portrait 10.3
grass 10.3
competition 10.1
field 10
activity 9.8
athlete 9.8
to 9.7
groom 9.6
cornet 9.4
youth 9.4
wedding 9.2
girls 9.1
park 9.1
human 9
play 8.6
wife 8.5
beach 8.4
friendship 8.4
trombone 8.4
playing 8.2
dress 8.1
team 8.1
educator 8
smiling 8
women 7.9
business 7.9
mother 7.7
father 7.7
two 7.6
action 7.4
teenager 7.3
exercise 7.3
fitness 7.2
body 7.2
game 7.1
room 7.1
businessman 7.1
together 7
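
Imagga's tagging endpoint yields tag/confidence pairs like those above. A sketch against the v2 /tags endpoint using requests; the API key, secret, and image URL are placeholders:

```python
import requests

IMAGGA_KEY = "YOUR_API_KEY"        # placeholder
IMAGGA_SECRET = "YOUR_API_SECRET"  # placeholder

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/annas-portrait.jpg"},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
response.raise_for_status()

for item in response.json()["result"]["tags"]:
    # Imagga reports confidence as a percentage and the tag text per language
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```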

Google
created on 2019-06-17

Microsoft
created on 2019-06-17

person 97.6
outdoor 95.8
clothing 86.9
group 84
footwear 78.8
posing 72.5
sport 67
smile 55.4
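
The Microsoft tags above are the kind of output produced by Azure Computer Vision's image analysis when restricted to the Tags feature. A sketch over the REST API using requests; the resource endpoint, subscription key, and image URL are placeholders:

```python
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
SUBSCRIPTION_KEY = "YOUR_KEY"                                     # placeholder

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY},
    json={"url": "https://example.org/annas-portrait.jpg"},
)
response.raise_for_status()

for tag in response.json()["tags"]:
    # Azure reports confidence in [0, 1]; scale by 100 to match the percentages above
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```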

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 35-53
Gender Female, 53.2%
Calm 49.4%
Angry 45.6%
Happy 45.2%
Sad 48.5%
Confused 45.5%
Surprised 45.4%
Disgusted 45.3%

AWS Rekognition

Age 16-27
Gender Female, 54.2%
Confused 45.2%
Calm 51.7%
Angry 45.3%
Sad 47.3%
Disgusted 45.1%
Happy 45.3%
Surprised 45.2%

AWS Rekognition

Age 27-44
Gender Female, 53.8%
Disgusted 45.1%
Confused 45.1%
Calm 45.4%
Angry 45.3%
Happy 49.7%
Sad 49.4%
Surprised 45.1%

AWS Rekognition

Age 26-43
Gender Male, 53.2%
Disgusted 45.2%
Calm 53.4%
Happy 45.4%
Sad 45.7%
Angry 45.1%
Surprised 45.1%
Confused 45.1%

AWS Rekognition

Age 26-43
Gender Male, 52.9%
Confused 45.3%
Happy 46.5%
Disgusted 45.2%
Calm 46.4%
Angry 45.3%
Sad 51.2%
Surprised 45.2%

AWS Rekognition

Age 11-18
Gender Female, 54.4%
Angry 45.3%
Calm 47.6%
Disgusted 45.1%
Surprised 45.1%
Confused 45.1%
Happy 45.7%
Sad 51%

AWS Rekognition

Age 26-44
Gender Female, 50.8%
Angry 45.3%
Happy 45.1%
Confused 45.2%
Calm 53.3%
Disgusted 45%
Sad 45.8%
Surprised 45.2%

AWS Rekognition

Age 14-23
Gender Male, 52.4%
Sad 48.4%
Surprised 45.6%
Disgusted 45.5%
Calm 45.7%
Angry 45.9%
Confused 45.8%
Happy 48.1%

AWS Rekognition

Age 48-68
Gender Female, 53.3%
Calm 52.1%
Happy 45.3%
Disgusted 45.1%
Sad 46.1%
Surprised 45.4%
Angry 45.4%
Confused 45.5%

AWS Rekognition

Age 26-43
Gender Female, 53.1%
Sad 48%
Confused 45.3%
Calm 47.5%
Happy 46.4%
Surprised 45.8%
Disgusted 46%
Angry 45.9%
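
Each block above corresponds to one face detected by Amazon Rekognition's DetectFaces operation, with an estimated age range, gender, and per-emotion confidences. A minimal sketch with boto3, assuming the same placeholder S3 location as earlier:

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "my-image-bucket", "Name": "annas-family-portrait.jpg"}},
    Attributes=["ALL"],  # required to get AgeRange, Gender, and Emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        # Emotion types come back upper-case (e.g. "CALM"), each with a confidence
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```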

Feature analysis

Amazon

Person 99.2%
Tie 98.2%
Helmet 90%

Text analysis

Amazon

RODVK
Y
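
The detected strings above are the kind of result returned by Rekognition's DetectText operation. A minimal sketch with boto3, again using a placeholder S3 location:

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "my-image-bucket", "Name": "annas-family-portrait.jpg"}}
)

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # skip the word-level duplicates
        print(detection["DetectedText"])
```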