Human Generated Data

Title

Untitled (man hammering nails into wood, others watching)

Date

1947

People

Artist: John Howell, American, active 1930s-1960s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.21709

Machine Generated Data

Tags

Amazon
created on 2022-03-11

Clothing 99.9
Apparel 99.9
Person 99.9
Human 99.9
Shorts 99.9
Person 99.8
Person 99.7
Person 98.9
Tie 96.6
Accessories 96.6
Accessory 96.6
Person 96.5
Grass 96.1
Plant 96.1
Suit 92.2
Overcoat 92.2
Coat 92.2
Shoe 83.2
Footwear 83.2
People 75
Shoe 70.5
Female 69.7
Face 66.9
Outdoors 63.4
Hat 59
Man 58.6
Tuxedo 57.1

Clarifai
created on 2023-10-22

people 99.9
group 99.1
group together 98.5
adult 96.9
man 96.6
woman 96
child 95.7
leader 95.5
administration 94.9
several 94.8
many 93.9
family 89.4
recreation 87.4
five 87.4
actor 85
three 84.7
four 80.5
military 79.6
police 79
war 78.9

Imagga
created on 2022-03-11

people 29
man 24.2
silhouette 24
person 21.3
male 20.6
adult 19.6
world 18.3
men 15.5
portrait 14.2
spectator 13.7
dark 13.4
fashion 12.8
black 12.7
happy 12.5
group 11.3
women 11.1
model 10.9
city 10.8
light 10.7
attractive 10.5
couple 10.4
walking 10.4
body 10.4
teacher 10.3
business 10.3
musical instrument 10.2
window 10.2
human 9.7
style 9.6
urban 9.6
child 9.5
hair 9.5
sunset 9
fun 9
professional 8.8
together 8.8
boy 8.7
happiness 8.6
educator 8.5
old 8.4
lady 8.1
sexy 8
device 7.9
love 7.9
travel 7.7
dancing 7.7
youth 7.7
walk 7.6
one 7.5
life 7.4
street 7.4
back 7.3
night 7.1
businessman 7.1
work 7.1

Microsoft
created on 2022-03-11

clothing 98.6
outdoor 94.2
person 92.5
man 92.1
black and white 87.4
footwear 85.5
text 82.7
street 56.1

Face analysis

AWS Rekognition

Age 43-51
Gender Male, 84.6%
Calm 93.2%
Sad 5%
Confused 1.1%
Disgusted 0.2%
Surprised 0.2%
Happy 0.1%
Angry 0.1%
Fear 0%

AWS Rekognition

Age 30-40
Gender Female, 57.1%
Sad 67.9%
Calm 23.1%
Happy 7%
Confused 0.6%
Disgusted 0.5%
Angry 0.4%
Fear 0.2%
Surprised 0.2%

AWS Rekognition

Age 23-31
Gender Male, 96.8%
Calm 91%
Happy 4.8%
Sad 2.9%
Confused 0.5%
Disgusted 0.5%
Angry 0.1%
Surprised 0.1%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.9%
Person 99.8%
Person 99.7%
Person 98.9%
Person 96.5%
Tie 96.6%
Shoe 83.2%
Shoe 70.5%
Hat 59%

Text analysis

Amazon

M III
M III VI7702 02240
VI7702
02240