Human Generated Data

Title

Untitled (older man talking with young family in suburban neighborhood)

Date

c. 1950

People

Artist: Harry Annas, American, 1897-1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2836

Machine Generated Data

Tags

Amazon
created on 2022-01-16

Person 99.8
Human 99.8
Person 99.6
Clothing 99.5
Apparel 99.5
Person 99.1
Grass 98.9
Plant 98.9
Outdoors 96.2
Nature 95.8
Shelter 92.7
Building 92.7
Countryside 92.7
Rural 92.7
People 90.5
Field 89.6
Female 83.1
Yard 78.1
Face 77.4
Pants 75.5
Housing 75
Sport 73
Sports 73
Shorts 72.9
Portrait 72.4
Photography 72.4
Photo 72.4
Team Sport 70.1
Team 70.1
Woman 63.9
Kid 63.8
Child 63.8
Grassland 63.4
Girl 63
Tree 59.1
Ground 56.7
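
The label/confidence pairs above are typical output of an image-tagging service. As a point of reference only (not the museum's actual pipeline), the following minimal sketch shows how such tags can be requested from Amazon Rekognition with boto3; the bucket and file names are hypothetical placeholders. The Clarifai, Imagga, and Microsoft lists below are produced in the same spirit by those providers' own tagging APIs.

import boto3

# Hypothetical S3 location for the digitized photograph.
IMAGE = {"S3Object": {"Bucket": "example-bucket", "Name": "annas-untitled.jpg"}}

rekognition = boto3.client("rekognition")

response = rekognition.detect_labels(
    Image=IMAGE,
    MaxLabels=50,       # cap the number of returned labels
    MinConfidence=50,   # drop labels below 50% confidence
)

for label in response["Labels"]:
    # Each label has a Name and a Confidence score (0-100),
    # matching rows such as "Person 99.8" and "Grass 98.9" above.
    print(f'{label["Name"]} {label["Confidence"]:.1f}')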

Clarifai
created on 2023-10-26

people 99.9
group together 99.4
child 99.1
two 98.9
three 98.8
adult 98.4
four 98.2
group 98.2
man 96.5
woman 95.7
recreation 94.6
family 93.3
boy 93.3
several 93.2
monochrome 90.7
home 89.1
five 87.8
many 86.1
sibling 84.3
war 82

Imagga
created on 2022-01-16

brass 69.7
wind instrument 66.9
trombone 58
musical instrument 45.8
man 37
sax 33.7
sport 28.3
grass 26.9
sky 26.2
adult 24.1
male 23.4
people 22.3
outdoors 21.7
outdoor 21.4
field 20.1
active 20
summer 19.3
leisure 18.3
person 17.8
play 17.2
golf 17.2
sunset 17.1
recreation 17
swing 16.6
club 16
golfer 15.9
freedom 15.6
ball 14.9
playing 14.6
happy 14.4
meadow 13.5
game 13.4
happiness 13.3
silhouette 13.2
player 12.8
day 12.6
lifestyle 12.3
free 12.2
action 12.1
outside 12
relax 11.8
tee 11.7
course 11.5
sun 11.3
success 11.3
exercise 10.9
suit 10.8
golfing 10.7
hit 10.7
fun 10.5
cornet 10.3
clouds 10.1
guy 10.1
competition 10.1
joy 10
activity 9.9
athlete 9.8
businessman 9.7
landscape 9.7
equipment 9.5
men 9.4
beach 9.3
environment 9
weapon 8.9
disaster 8.8
practice 8.7
couple 8.7
bass 8.6
sunny 8.6
bassoon 8.6
accordion 8.4
park 8.2
protection 8.2
vacation 8.2
country 7.9
business 7.9
love 7.9
destruction 7.8
attractive 7.7
youth 7.7
hole 7.7
two 7.6
hobby 7.6
dark 7.5
teen 7.4
danger 7.3
copy space 7.2
professional 7.2
mountain 7.1
portrait 7.1
sea 7

Microsoft
created on 2022-01-16

outdoor 99.4
black and white 89.7
text 88.1
person 82
clothing 51.1

Face analysis

AWS Rekognition

Age 48-56
Gender Male, 99.9%
Calm 97.5%
Sad 0.7%
Surprised 0.5%
Happy 0.5%
Disgusted 0.3%
Confused 0.2%
Angry 0.1%
Fear 0.1%

AWS Rekognition

Age 38-46
Gender Male, 91.7%
Calm 99.2%
Sad 0.4%
Surprised 0.1%
Disgusted 0.1%
Happy 0.1%
Confused 0.1%
Angry 0.1%
Fear 0%

AWS Rekognition

Age 24-34
Gender Female, 68.4%
Calm 98.2%
Confused 0.6%
Sad 0.3%
Surprised 0.3%
Disgusted 0.2%
Happy 0.2%
Fear 0.2%
Angry 0.1%
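
The three per-face estimates above (age range, gender, ranked emotion scores) are the attributes Amazon Rekognition's face detection returns. A minimal sketch, again assuming the same hypothetical S3 location rather than the museum's actual setup:

import boto3

rekognition = boto3.client("rekognition")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "annas-untitled.jpg"}},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]      # e.g. {"Low": 48, "High": 56}
    gender = face["Gender"]     # e.g. {"Value": "Male", "Confidence": 99.9}
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions come back as a list of {Type, Confidence} entries; sorting them
    # mirrors the "Calm 97.5% ... Fear 0.1%" ordering shown above.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')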

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
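
The five blocks above report Google Cloud Vision face-detection likelihoods, which the API expresses as an enum (VERY_UNLIKELY through VERY_LIKELY) rather than a percentage. A minimal sketch with the google-cloud-vision client, using a hypothetical local filename:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("annas-untitled.jpg", "rb") as f:  # hypothetical filename
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each likelihood field mirrors a row above, e.g. "Surprise Very unlikely".
    for field in ("surprise", "anger", "sorrow", "joy", "headwear", "blurred"):
        likelihood = getattr(face, f"{field}_likelihood")
        print(field.capitalize(), vision.Likelihood(likelihood).name)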

Feature analysis

Amazon

Person 99.8%

Text analysis

Amazon

>
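
The single detected string above (">") appears to be a stray mark picked up by Amazon Rekognition's text detection rather than a caption. A minimal sketch of that call, with the same hypothetical S3 location as in the earlier examples:

import boto3

rekognition = boto3.client("rekognition")

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "annas-untitled.jpg"}},
)

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # skip word-level duplicates of each line
        print(detection["DetectedText"], f'{detection["Confidence"]:.1f}')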