Human Generated Data

Title

Untitled (group of soldiers with forklift, Vietnam)

Date

1967-68

People

Artist: Gordon W. Gahan, American, 1945–1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Barbara Robinson, 2007.184.2.217.3

Machine Generated Data

Tags (label and confidence score, 0–100)

Amazon
created on 2022-01-23

Person 99.7
Human 99.7
Person 99.6
Person 99.5
Person 99.5
Person 99.4
Person 99
Person 98.8
Person 96.9
Astronaut 90.2
Wheel 88
Machine 88
Vehicle 73
Transportation 73
Sports Car 70.2
Car 70.2
Automobile 70.2
Truck 64.8
Tire 64.1
Person 56.2

Imagga
created on 2022-01-23

truck 47.6
motor vehicle 35.7
stage 26.4
garbage truck 22.7
trailer truck 21.8
platform 21.5
sky 20.4
stretcher 19.2
vehicle 18.6
car 17.4
transportation 17
wheeled vehicle 15.9
transport 15.5
industry 15.4
litter 15.3
man 14.3
travel 14.1
people 13.9
industrial 13.6
landscape 13.4
clouds 12.7
road 12.6
sunset 11.7
military 11.6
ocean 10.8
water 10.7
night 10.7
male 10.6
danger 10
silhouette 9.9
outdoors 9.7
war 9.6
sea 9.5
person 9.3
smoke 9.3
beach 9.3
dark 9.2
outdoor 9.2
city 9.1
protection 9.1
adult 9
trailer 9
black 9
cargo 8.7
motion 8.6
evening 8.4
tourism 8.2
environment 8.2
dirty 8.1
horizon 8.1
building 8.1
activity 8.1
sun 8
soldier 7.8
destruction 7.8
accident 7.8
toxic 7.8
protective 7.8
shipping 7.8
cloud 7.7
chemical 7.7
gas 7.7
outside 7.7
old 7.7
winter 7.7
machine 7.6
heavy 7.6
drive 7.6
field 7.5
leisure 7.5
symbol 7.4
light 7.3
world 7.3
limousine 7.3
coast 7.2
weapon 7.2
mountain 7.1

Microsoft
created on 2022-01-23

text 88.5
person 77.7
vehicle 76.3
man 64.9
land vehicle 64.5
black and white 64

Face analysis

Amazon

AWS Rekognition

Age 18-24
Gender Female, 60.7%
Calm 72.6%
Surprised 8.1%
Disgusted 6.1%
Angry 5.1%
Sad 3.3%
Confused 2.5%
Happy 1.6%
Fear 0.8%

AWS Rekognition

Age 27-37
Gender Male, 92.9%
Calm 97.1%
Surprised 1%
Sad 0.6%
Fear 0.4%
Disgusted 0.3%
Angry 0.2%
Confused 0.2%
Happy 0.2%

Google

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Wheel 88%
Truck 64.8%

Captions

Microsoft

a group of people standing in front of a truck 75.9%
a group of people riding on the back of a truck 58.8%
a group of men riding on the back of a truck 49.9%

Text analysis

Amazon

CLARK
FORCE
AIR
US AIR FORCE
6757
US

Google

CL4INK US AIR FORCE 6757
CL4INK
US
AIR
FORCE
6757