Human Generated Data

Title

Untitled (three men standing at oil well site)

Date

1966

People

Artist: Harry Annas, American, 1897 - 1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2608

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.8
Human 99.8
Person 99.7
Person 99.5
Clothing 98.3
Apparel 98.3
Person 95.9
Person 91.1
Shoe 87
Footwear 87
Face 81
Female 78.4
People 76.1
Outdoors 73.2
Coat 72.1
Car 71.3
Transportation 71.3
Vehicle 71.3
Automobile 71.3
Suit 71
Overcoat 71
Girl 64.3
Ground 63
Hat 62.3
Plant 62.2
Photography 61.1
Photo 61.1
Military Uniform 59.7
Military 59.7
Animal 59.7
Officer 59.3
Tire 58.3
Woman 57.4
Field 56.4
Crowd 55.9
Pants 55.6
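Each machine-generated tag above pairs a label with a confidence score from 0 to 100. A minimal sketch of filtering such a list by a confidence threshold, using a few of the Amazon tags listed above (the threshold value is an arbitrary choice for illustration):

```python
# A few (label, confidence) pairs copied from the Amazon tag list above.
amazon_tags = [
    ("Person", 99.8), ("Clothing", 98.3), ("Shoe", 87.0),
    ("Car", 71.3), ("Crowd", 55.9),
]

def confident_labels(tags, threshold=90.0):
    """Keep only labels whose confidence meets the threshold."""
    return [label for label, score in tags if score >= threshold]

print(confident_labels(amazon_tags))  # ['Person', 'Clothing']
```

Lowering the threshold admits progressively less certain labels, which is why the tail of each provider's list (e.g. "Crowd 55.9") tends to be noisier than the head.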

Clarifai
created on 2023-10-26

people 99.9
group together 98.4
adult 98.3
group 97.6
man 96.5
three 96
child 93.9
four 93.9
administration 93.5
several 93.1
woman 93
war 92.9
outfit 92.5
two 92.3
five 88.3
military 87.8
wear 85.1
soldier 82.8
police 82.6
leader 82

Imagga
created on 2022-01-15

military uniform 71.9
uniform 66.6
clothing 48.3
man 29.5
consumer goods 29.4
covering 29.1
male 24.9
person 18.4
people 17.8
adult 17.5
military 16.4
danger 16.4
gun 16
sport 15.8
protection 15.5
soldier 14.7
outdoor 14.5
commodity 14.3
war 13.5
weapon 13.4
boy 13
outdoors 12.7
old 12.5
helmet 12.4
men 12
camouflage 11.9
day 11.8
mask 10.8
destruction 10.7
army 10.7
child 10.3
outside 10.3
dark 10
leisure 10
fun 9.7
nuclear 9.7
guy 9.4
rifle 9.2
sky 8.9
couple 8.7
work 8.6
walking 8.5
against 8.3
environment 8.2
industrial 8.2
recreation 8.1
activity 8.1
grass 7.9
disaster 7.8
black 7.8
hiking 7.7
risk 7.7
vintage 7.4
sports 7.4
world 7.3
life 7.2
history 7.2
game 7.1
athlete 7.1
portrait 7.1
mountain 7.1
travel 7

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

clothing 98.4
man 94.8
person 91.7
outdoor 91.4
text 90.9
standing 83.7
footwear 74.7
black and white 59.4

Color Analysis

Face analysis

AWS Rekognition

Age 54-64
Gender Male, 99.3%
Sad 48.2%
Happy 28.9%
Calm 14.1%
Confused 3.5%
Disgusted 2.6%
Surprised 1.1%
Angry 1%
Fear 0.7%

AWS Rekognition

Age 47-53
Gender Male, 98.9%
Sad 48%
Confused 17.2%
Calm 10.2%
Fear 8.3%
Disgusted 7.8%
Happy 5.5%
Surprised 1.6%
Angry 1.4%
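Each AWS Rekognition face result above lists per-emotion confidence scores that sum to roughly 100%. A minimal sketch of picking the dominant emotion, with values copied from the first face above:

```python
# Emotion scores for the first detected face, copied from the record above.
face_emotions = {
    "Sad": 48.2, "Happy": 28.9, "Calm": 14.1, "Confused": 3.5,
    "Disgusted": 2.6, "Surprised": 1.1, "Angry": 1.0, "Fear": 0.7,
}

def dominant_emotion(emotions):
    """Return the (name, score) pair with the highest confidence."""
    return max(emotions.items(), key=lambda item: item[1])

print(dominant_emotion(face_emotions))  # ('Sad', 48.2)
```

Note that a dominant score under 50%, as here, means the model was split between several emotions rather than confident in one.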

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible
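Unlike the numeric scores above, Google Vision reports face attributes as likelihood buckets. A minimal sketch ordering those buckets so results can be compared across faces; the bucket names follow the labels used in the record above:

```python
# Likelihood buckets in ascending order, matching the labels above.
LIKELIHOOD_ORDER = ["Very unlikely", "Unlikely", "Possible", "Likely", "Very likely"]

def likelihood_rank(label):
    """Map a likelihood label to an integer rank (higher = more likely)."""
    return LIKELIHOOD_ORDER.index(label)

# E.g. the one face reported with 'Headwear Possible' ranks above the
# faces reported 'Headwear Very unlikely':
print(likelihood_rank("Possible") > likelihood_rank("Very unlikely"))  # True
```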

Feature analysis

Amazon

Person 99.8%
Shoe 87%
Car 71.3%

Categories

Text analysis

Amazon

KODAK
-

Google

KODVK
KODVK