Human Generated Data

Title

Untitled (Horse Dance, Java)

Date

January 26 – February 2, 1960

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.5169

Machine Generated Data

Tags

Amazon
created on 2023-10-05

Architecture 100
Building 100
Factory 100
Assembly Line 100
Manufacturing 100
Adult 98.7
Male 98.7
Man 98.7
Person 98.7
Male 97.4
Person 97.4
Boy 97.4
Child 97.4
Face 92.8
Head 92.8
Adult 88.6
Male 88.6
Man 88.6
Person 88.6
Adult 80.5
Person 80.5
Bride 80.5
Female 80.5
Wedding 80.5
Woman 80.5
Person 74.4
Outdoors 57.5
Weaving 56.7
Worker 55.5

Clarifai
created on 2018-05-10

people 99.9
adult 99.1
group together 98
group 97.7
man 97.1
vehicle 95.8
war 95.1
many 93.2
one 92.4
woman 91.7
administration 90.6
transportation system 89.9
military 89
two 87.4
wear 87.1
monochrome 84.4
several 84.1
soldier 84
weapon 82.1
three 79.3

Imagga
created on 2023-10-05

crutch 22.8
staff 17.7
sax 17.4
brass 17
wind instrument 16.9
device 14.6
stick 13.7
man 13.4
person 13.2
people 12.8
outdoors 12.7
building 11.9
old 11.8
machine 11.2
adult 10.3
steel 9.7
metal 9.6
black 9.6
iron 9.3
portrait 9.1
trombone 8.8
bridle 8.8
male 8.6
architecture 8.6
horn 8.5
city 8.3
light 8
smiling 8
musical instrument 7.9
wall 7.8
ancient 7.8
summer 7.7
outside 7.7
snow 7.6
happy 7.5
wood 7.5
equipment 7.4
vintage 7.4
close 7.4
window 7.3
industrial 7.3
lifestyle 7.2
bassoon 7.2
modern 7

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

person 98.9
outdoor 92

Face analysis

AWS Rekognition

Age 22-30
Gender Male, 98.6%
Sad 96%
Calm 46.8%
Surprised 6.4%
Fear 6.1%
Confused 1.9%
Angry 0.7%
Disgusted 0.5%
Happy 0.4%

AWS Rekognition

Age 26-36
Gender Male, 98.4%
Sad 69.4%
Calm 56.2%
Fear 9.4%
Surprised 6.5%
Confused 2.3%
Angry 0.4%
Disgusted 0.3%
Happy 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Adult 98.7%
Male 98.7%
Man 98.7%
Person 98.7%
Boy 97.4%
Child 97.4%
Bride 80.5%
Female 80.5%
Woman 80.5%
