Human Generated Data

Title

Untitled (three children lined up for performance)

Date

c. 1940, printed later

People

Artist: Harry Annas, American, 1897–1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6702

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Human 97.7
Military 97.1
Person 96.7
Person 96.1
Military Uniform 96.1
Person 95
Flooring 92.9
Officer 82.2
Helmet 71.1
Apparel 71.1
Clothing 71.1
Soldier 69.3
Art 64.8
Figurine 60.7
Army 59.3
Armored 59.3
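
The Amazon tags above (each score is a confidence percentage) are the kind of output returned by AWS Rekognition's label-detection call. A minimal boto3 sketch is below; the S3 bucket and object key are illustrative placeholders, not the museum's actual storage.

```python
# Sketch: requesting image labels from AWS Rekognition with boto3.
# The S3 bucket/key below are illustrative placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "annas-untitled.jpg"}},
    MaxLabels=20,
    MinConfidence=50,
)

for label in response["Labels"]:
    # e.g. "Human 97.7", "Military Uniform 96.1", ...
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```

The repeated Person entries with slightly different scores are consistent with per-instance detections, which Rekognition reports under each label's Instances field together with bounding boxes; that is also the sort of data the Feature analysis section at the end of this record reflects.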

Clarifai
created on 2019-11-16

people 99.7
man 97.7
adult 96.9
woman 96
group 94.7
two 93.5
monochrome 91.6
indoors 89
group together 88.1
wear 85.8
museum 84.9
one 84.6
three 84.2
room 83.6
portrait 83.5
child 82.7
movie 82.6
television 81.4
music 81.3
love 80.1
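
The Clarifai concepts follow the same tag/confidence pattern. A hedged sketch against Clarifai's v2 REST predict endpoint is below; the API key, image URL, and the general-model identifier are placeholders/assumptions.

```python
# Sketch: tagging an image with Clarifai's v2 predict REST API.
# API key, model id, and image URL are placeholders/assumptions.
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"
MODEL_ID = "general-image-recognition"  # assumed alias for the general model

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/annas-untitled.jpg"}}}]},
)
resp.raise_for_status()

for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    # e.g. "people 99.7", "man 97.7", ...
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```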

Imagga
created on 2019-11-16

people 24
television 22.4
man 21.7
person 21.1
black 20.8
silhouette 19
male 17.7
adult 15.7
body 13.6
dark 13.4
telecommunication system 13.2
business 12.7
businessman 12.4
fashion 12.1
human 12
one 11.9
women 11.9
love 11.8
portrait 11.6
boy 11.4
office 11.3
attractive 11.2
hair 11.1
window 10.9
model 10.9
symbol 10.1
posing 9.8
group 9.7
style 9.6
performer 9.1
sensuality 9.1
fun 9
device 8.9
happy 8.8
light 8.7
crowd 8.6
design 8.6
art 8.6
groom 8.5
youth 8.5
passion 8.5
dancer 8.4
studio 8.4
sexy 8
urban 7.9
monitor 7.9
couple 7.8
sitting 7.7
men 7.7
screen 7.5
background 7.3
laptop 7.2
looking 7.2
home 7.2
smile 7.1
face 7.1
computer 7
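
Tags like Imagga's above can be requested from its v2 tagging endpoint with HTTP basic auth. A minimal sketch follows; the API key, secret, and image URL are placeholders.

```python
# Sketch: requesting tags from the Imagga v2 tagging endpoint.
# Credentials and image URL are placeholders.
import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/annas-untitled.jpg"},
    auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),
)
resp.raise_for_status()

for tag in resp.json()["result"]["tags"]:
    # e.g. "people 24", "television 22.4", ...
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')
```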

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

text 98.8
clothing 95.6
person 91.3
monitor 90.7
indoor 86.1
electronics 85.7
black and white 85.6
footwear 75
white 62.9
man 62.4
computer 57
posing 53.3
old 51.7
image 33.6
picture frame 21.7
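
The Microsoft tags resemble output from the Azure Computer Vision tagging endpoint. A hedged REST sketch is below; the resource endpoint, subscription key, API version, and image URL are assumptions.

```python
# Sketch: tagging an image with Azure Computer Vision's REST API.
# Endpoint, key, API version, and image URL are placeholders/assumptions.
import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"
KEY = "YOUR_SUBSCRIPTION_KEY"

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": KEY, "Content-Type": "application/json"},
    json={"url": "https://example.org/annas-untitled.jpg"},
)
resp.raise_for_status()

for tag in resp.json()["tags"]:
    # e.g. "text 98.8", "clothing 95.6", ...
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```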

Color Analysis

Face analysis

AWS Rekognition

Age 7-17
Gender Male, 52.4%
Fear 45%
Angry 45%
Sad 45.1%
Happy 45.1%
Calm 54.8%
Confused 45%
Surprised 45%
Disgusted 45%

AWS Rekognition

Age 3-9
Gender Female, 54.6%
Calm 53%
Sad 45.3%
Angry 45.2%
Disgusted 45.1%
Happy 45.2%
Surprised 45.5%
Fear 45.1%
Confused 45.5%

AWS Rekognition

Age 11-21
Gender Female, 52.5%
Sad 45.1%
Disgusted 45%
Surprised 46.1%
Happy 52.2%
Angry 45.1%
Fear 45.1%
Confused 45.4%
Calm 46%
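
The age range, gender, and per-emotion confidence rows in the three blocks above are the fields AWS Rekognition's face-detection call returns when all facial attributes are requested. A minimal boto3 sketch (placeholder bucket/key) follows.

```python
# Sketch: per-face age, gender, and emotion estimates from AWS Rekognition.
# The S3 bucket/key are illustrative placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "annas-untitled.jpg"}},
    Attributes=["ALL"],  # include AgeRange, Gender, Emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        # e.g. "Calm 54.8%", "Sad 45.1%", ...
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```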

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely
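
The Google Vision rows report face-annotation likelihoods (joy, sorrow, anger, surprise, headwear, blur) on a scale from Very unlikely to Very likely. A hedged sketch with the google-cloud-vision client library is below; the image URI is a placeholder, and older library versions expose the enum slightly differently.

```python
# Sketch: face-annotation likelihoods from the Google Cloud Vision API.
# The image URI is an illustrative placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image()
image.source.image_uri = "https://example.org/annas-untitled.jpg"

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each field is a Likelihood enum (VERY_UNLIKELY ... VERY_LIKELY).
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```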

Feature analysis

Amazon

Person 96.7%
Helmet 71.1%

Categories