Human Generated Data

Title

Untitled (two soldiers inside control room of ship, Vietnam)

Date

1967-68

People

Artist: Gordon W. Gahan, American 1945 - 1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Barbara Robinson, 2007.184.2.97.3

Machine Generated Data

Tags (label followed by confidence score, %)

Amazon
created on 2022-01-23

Person 99.7
Human 99.7
Person 99.6
Wheel 73.1
Machine 73.1
Shorts 72.5
Clothing 72.5
Apparel 72.5
Monitor 64.1
Electronics 64.1
Display 64.1
Screen 64.1
Icing 58.6
Food 58.6
Dessert 58.6
Cake 58.6
Cream 58.6
Creme 58.6
Porch 57.7
Furniture 57.3
Clinic 57.3
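
The numbers above are label-confidence scores from Amazon's image-labeling service. A minimal sketch of how such tags could be produced with the AWS Rekognition DetectLabels API via boto3; the file name "photo.jpg" and the threshold values are placeholders, not values taken from this record:

```python
import boto3

# Assumes AWS credentials are already configured; "photo.jpg" is a placeholder image file.
client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=25,        # cap on returned labels (assumed value)
        MinConfidence=50.0,  # drop labels below 50% confidence (assumed value)
    )

# Print "Label confidence" pairs in the same shape as the tag list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```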

Clarifai
created on 2023-10-22

people 99.6
vehicle 99.1
adult 97.5
monochrome 95.7
transportation system 95.1
watercraft 95
man 94.3
one 93.1
woman 92.1
group together 87.3
two 84.9
indoors 84.4
technology 83.6
military 79.9
industry 79.8
aircraft 79.4
group 79.3
navy 78
outfit 77.2
three 76.1

Imagga
created on 2022-01-23

car 37
vehicle 29.3
transportation 27.8
man 22.2
device 20.9
aviator 19.2
male 18.4
people 17.8
person 17.8
passenger 17.6
equipment 17.4
transport 17.3
adult 16.8
automobile 16.3
outdoors 15.7
sitting 14.6
drive 14.2
driver 13.6
motor vehicle 13.3
smiling 13
inside 12.9
business 12.7
helm 12
cockpit 11.8
ambulance 11.6
travel 11.3
machine 11.2
driving 10.6
war 10.6
truck 10.5
auto 10.5
traffic 10.4
occupation 10.1
holding 9.9
uniform 9.8
women 9.5
day 9.4
outside 9.4
happy 9.4
industry 9.4
metal 8.8
interior 8.8
military 8.7
work 8.6
danger 8.2
road 8.1
stretcher 8.1
steering system 8
nurse 7.9
breathing device 7.8
accident 7.8
men 7.7
luxury 7.7
engine 7.7
professional 7.7
mechanism 7.6
technology 7.4
vacation 7.4
street 7.4
20s 7.3
speed 7.3
room 7.3
lifestyle 7.2
looking 7.2
oxygen mask 7.1
portrait 7.1

Microsoft
created on 2023-10-30

person 50.5

Face analysis

AWS Rekognition

Age 41-49
Gender Male, 99.7%
Calm 99.8%
Sad 0.1%
Surprised 0%
Angry 0%
Disgusted 0%
Confused 0%
Happy 0%
Fear 0%
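
The age range, gender confidence, and emotion percentages above match the shape of Rekognition's DetectFaces output. A minimal sketch of retrieving them with boto3, assuming configured AWS credentials and a placeholder image file:

```python
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # placeholder image file
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```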

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
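
The Google Vision values above are face-detection likelihood ratings rather than percentages. A minimal sketch of reading them with the google-cloud-vision client, assuming configured Google Cloud credentials and a placeholder image file:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:  # placeholder image file
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each attribute is a Likelihood enum (VERY_UNLIKELY .. VERY_LIKELY), as listed above.
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```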

Feature analysis

Amazon

Person 99.7%
Wheel 73.1%
