Human Generated Data

Title

Untitled (African American men working at underground construction site)

Date

1952

People

Artist: Orrion Barger, American, active 1913 - 1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6305

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 99.6
Human 99.6
Person 99.5
Person 99.1
Person 94.6
Outdoors 83.2
Nature 82.8
Person 80.3
People 76.4
Silhouette 72.6
Airplane 69.8
Aircraft 69.8
Vehicle 69.8
Transportation 69.8
Pedestrian 68.3
Photography 61.7
Photo 61.7
Clothing 56.6
Apparel 56.6
Duel 55.6
Snow 55.1
Sketch 55.1
Art 55.1
Drawing 55.1
Person 45.3
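
The Amazon tags above are label/confidence pairs, with confidence expressed as a percentage. A minimal sketch of how such output could be reproduced with AWS Rekognition's DetectLabels operation via boto3 follows; the image file name and the MinConfidence threshold are assumptions, not part of this record.

```python
import boto3

rekognition = boto3.client("rekognition")  # credentials/region come from the environment

with open("photograph.jpg", "rb") as f:    # hypothetical local copy of the photograph
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=45,  # assumed threshold; the record lists labels down to roughly 45%
)

# Each label carries a name and a confidence percentage, matching the
# "Person 99.6", "Outdoors 83.2", ... pairs listed above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```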

Clarifai
created on 2023-10-26

people 100
group together 99
man 97.8
adult 96.4
watercraft 96.3
group 95.7
two 93.5
military 93.1
vehicle 93
action 92.5
child 92.1
wear 92
three 90.5
transportation system 89.4
recreation 88.1
aircraft 87.1
soldier 85.6
woman 85.3
many 84.2
war 83.5

Imagga
created on 2022-01-22

sunset 37.7
beach 37.4
water 34
ocean 32.3
sea 32.1
silhouette 27.3
sky 24.2
sand 20
sun 18.1
boat 17.9
travel 17.6
summer 17.4
coast 17.1
people 16.7
ship 16.7
vessel 16.6
sport 16.5
vacation 16.4
landscape 16.4
man 15.5
fishing 15.4
outdoors 15.1
wave 14.7
evening 14
waves 13.9
male 13.5
light 13.4
fun 12.7
dark 12.5
sunrise 12.2
recreation 11.7
fisherman 11.6
dusk 11.4
clouds 11
exercise 10.9
craft 10.4
shore 10.2
relax 10.1
lake 10.1
leisure 10
horizon 9.9
wreck 9.3
action 9.3
tree 9.2
active 9.2
river 8.9
dawn 8.7
lifestyle 8.7
shipwreck 8.6
cloud 8.6
holiday 8.6
mechanical device 8.5
adult 8.5
two 8.5
relaxation 8.4
island 8.2
calm 8.2
person 8.2
swing 7.9
black 7.8
boats 7.8
bay 7.7
orange 7.7
tropical 7.7
harp 7.6
paradise 7.5
coastline 7.5
happy 7.5
tranquil 7.2
stringed instrument 7.2
romantic 7.1
sunlight 7.1
mechanism 7.1
scenic 7

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

water 94.7
ship 92.1
text 87.2
black and white 85
watercraft 82.1
boat 81.8
person 54.9
old 51.9

Face analysis

AWS Rekognition

Age 27-37
Gender Male, 99.9%
Calm 96.9%
Confused 1.2%
Surprised 0.6%
Disgusted 0.5%
Happy 0.3%
Fear 0.2%
Sad 0.1%
Angry 0.1%

AWS Rekognition

Age 6-14
Gender Female, 94.6%
Calm 77.5%
Sad 18.3%
Surprised 1.1%
Confused 0.9%
Fear 0.7%
Disgusted 0.6%
Happy 0.5%
Angry 0.5%
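
The two AWS Rekognition blocks above report an estimated age range, a gender guess with confidence, and per-emotion confidence percentages for each detected face. A minimal sketch, assuming boto3 and a hypothetical local image file, of how such face details could be retrieved with the DetectFaces operation:

```python
import boto3

rekognition = boto3.client("rekognition")

with open("photograph.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions come back with per-emotion confidences, e.g. Calm 96.9%, Confused 1.2%, ...
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```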

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
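
Google Vision reports face attributes as categorical likelihood buckets (Very unlikely through Very likely) rather than numeric confidences. A minimal sketch, assuming the google-cloud-vision client library and a hypothetical local image file, of how the values above could be obtained:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photograph.jpg", "rb") as f:  # hypothetical file name
    content = f.read()

response = client.face_detection(image=vision.Image(content=content))

for face in response.face_annotations:
    # Likelihood is an enum: UNKNOWN, VERY_UNLIKELY, UNLIKELY, POSSIBLE, LIKELY, VERY_LIKELY
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```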

Feature analysis

Amazon

Person 99.6%
Airplane 69.8%
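
The feature analysis entries correspond to labels for which Rekognition also returned located instances (bounding boxes), here Person and Airplane. A minimal sketch, again assuming boto3 and a hypothetical local image file:

```python
import boto3

rekognition = boto3.client("rekognition")

with open("photograph.jpg", "rb") as f:  # hypothetical file name
    response = rekognition.detect_labels(Image={"Bytes": f.read()})

# Some labels (e.g. Person, Airplane) include Instances with bounding boxes
# given in relative (0-1) coordinates.
for label in response["Labels"]:
    for instance in label.get("Instances", []):
        box = instance["BoundingBox"]
        print(f"{label['Name']} {instance['Confidence']:.1f}% "
              f"(left={box['Left']:.2f}, top={box['Top']:.2f}, "
              f"width={box['Width']:.2f}, height={box['Height']:.2f})")
```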