Human Generated Data

Title

Untitled (several men, two in US Army jeeps, on wide field with trees in background)

Date

c. 1950

People

Artist: Jack Rodden Studio, American, 1914 - 2016

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.13409

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Transportation 98.8
Vehicle 98.8
Truck 98.8
Machine 98.5
Wheel 98.5
Wheel 97.1
Person 96.7
Human 96.7
Person 94.5
Wheel 92.8
Person 89.7
Person 80.7
Half Track 79
Person 78.9
Person 74.2
Person 72.5
Car 70.1
Automobile 70.1
Wheel 68.3
Fire Truck 57.8
People 55.8
Street 55.5
Building 55.5
Road 55.5
City 55.5
Urban 55.5
Town 55.5

Imagga
created on 2022-03-05

motor vehicle 100
fire engine 100
truck 100
vehicle 66.1
tow truck 42.2
wheeled vehicle 41.3
transportation 37.7
car 33.4
transport 32
road 26.2
machine 24
machinery 23.4
wheel 22.7
driving 21.3
auto 21.1
tractor 20.8
equipment 20.4
engine 20.2
tire 19.5
industrial 19.1
industry 18.8
automobile 18.2
heavy 18.1
work 16.5
drive 16.1
old 15.3
dirt 15.3
cargo 14.6
fire 14.1
lorry 13.6
farm 13.4
half track 13.4
construction 12.8
diesel 12.8
rescue 12.7
wheels 12.7
motor 12.6
outdoor 12.2
sky 12.1
landscape 11.9
power 11.8
emergency 11.6
military vehicle 11.2
grass 11.1
dirty 10.9
load 10.8
tracked vehicle 10.8
rural 10.6
site 10.3
land 10.1
danger 10
yellow 9.9
delivery 9.7
working 9.7
driver 9.7
metal 9.7
agriculture 9.7
trailer 9.6
safety 9.2
speed 9.2
retro 9
fireman 8.9
trucking 8.9
tires 8.9
travel 8.5
vintage 8.3
building 7.9
semi 7.9
4x4 7.9
hose 7.9
carrier 7.9
deliver 7.9
freight 7.8
highway 7.7
track 7.7
move 7.7
dangerous 7.6
ground 7.6
adventure 7.6
farming 7.6
field 7.5
earth 7.3
protection 7.3
activity 7.2

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

truck 100
text 99.5
tree 98.2
outdoor 96.1
vehicle 94.2
land vehicle 93.3
wheel 87.3
old 78.1
military vehicle 73.5
transport 66.9
white 62.3
auto part 53.4

Face analysis

Amazon

Google

AWS Rekognition

Age 23-33
Gender Female, 98.8%
Calm 47.2%
Fear 14.7%
Happy 11.6%
Sad 10.3%
Surprised 5.7%
Disgusted 4.1%
Confused 3.4%
Angry 2.9%

AWS Rekognition

Age 6-16
Gender Male, 81%
Calm 46.3%
Surprised 43.4%
Fear 7.5%
Happy 0.8%
Angry 0.8%
Disgusted 0.6%
Sad 0.3%
Confused 0.3%

AWS Rekognition

Age 35-43
Gender Female, 79.2%
Calm 81.7%
Surprised 5.9%
Happy 5.8%
Angry 2.5%
Fear 2%
Disgusted 1.1%
Sad 0.5%
Confused 0.4%

AWS Rekognition

Age 36-44
Gender Male, 98.3%
Calm 75.1%
Sad 10.5%
Confused 6.8%
Disgusted 4.1%
Happy 1.1%
Surprised 1.1%
Fear 0.7%
Angry 0.6%

AWS Rekognition

Age 34-42
Gender Male, 97.1%
Calm 69.5%
Disgusted 11.7%
Sad 9.1%
Happy 3.4%
Surprised 3.1%
Fear 1.5%
Confused 0.8%
Angry 0.8%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Truck 98.8%
Wheel 98.5%
Person 96.7%

Captions

Microsoft

a vintage photo of a truck 96.3%
an old photo of a truck 96.2%
old photo of a truck 95%

Text analysis

Amazon

35
2
20892713
C
C 1
1
I
KODAKA-ITW
S.ARME

Google

MJI7-- YT37A°2 - - NAGO
YT37A°2
-
MJI7--
NAGO