Human Generated Data

Title

Untitled (family having picnic outside by car)

Date

c. 1950

People

Artist: Jack Rodden Studio, American 1914 - 2016

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.13855

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Wheel 99.5
Machine 99.5
Person 99.1
Human 99.1
Person 99
Person 98.8
Automobile 96.8
Vehicle 96.8
Transportation 96.8
Model T 96.1
Antique Car 96.1
Car 91.5
Tire 77.7
Spoke 75.8
Clothing 67.2
Apparel 67.2
People 63.6
Hot Rod 63.4
Coupe 59.4
Sports Car 59.4
Car Wheel 57.1
Overcoat 56
Coat 56
Wheel 54.1
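The Amazon tags above are the kind of output AWS Rekognition's DetectLabels operation returns: a label name paired with a confidence score. A minimal sketch of rendering such a response into the "Name Confidence" lines shown here, assuming the real DetectLabels response shape ({"Labels": [{"Name": ..., "Confidence": ...}]}); the sample values are copied from the list above, not a live API call:

```python
# Render an AWS Rekognition DetectLabels-style response into
# "Name Confidence" lines like those listed above. A live call would be
# rekognition.detect_labels(Image={"Bytes": image_data}).
def format_labels(response, min_confidence=55.0):
    """Return 'Name Confidence' lines, highest confidence first."""
    labels = [
        (lbl["Name"], lbl["Confidence"])
        for lbl in response["Labels"]
        if lbl["Confidence"] >= min_confidence
    ]
    labels.sort(key=lambda pair: pair[1], reverse=True)
    return [f"{name} {round(conf, 1):g}" for name, conf in labels]

# Sample shaped like the first few tags above.
sample = {"Labels": [
    {"Name": "Wheel", "Confidence": 99.5},
    {"Name": "Person", "Confidence": 99.1},
    {"Name": "Car", "Confidence": 91.5},
    {"Name": "Tire", "Confidence": 77.7},
]}

print("\n".join(format_labels(sample)))
```

Scores are rounded to one decimal and trailing zeros dropped, which reproduces the mixed "99.5" / "56" style seen in the lists above.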

Imagga
created on 2022-01-23

car 100
vehicle 100
model t 73.1
half track 59.9
motor vehicle 58.1
military vehicle 48.3
tracked vehicle 47.9
wheeled vehicle 45
transportation 39.5
auto 34.5
transport 29.2
drive 27.4
wheel 27.1
truck 26.8
automobile 26.8
old 24.4
machine 24.3
tractor 23.7
driving 21.3
machinery 19.5
engine 19.3
equipment 18.9
tire 18.8
road 16.3
industry 15.4
industrial 14.5
power 14.3
rural 14.1
tires 13.8
motor 13.6
sky 13.4
work 13.3
vintage 13.2
classic 13
speed 12.8
retro 12.3
antique 12
construction 12
earth 11.9
heavy 11.5
travel 11.3
yellow 11.3
field 10.9
sport 10.7
working 10.6
agriculture 10.5
dirt 10.5
metal 10.5
sports 10.2
model 10.1
farm 9.8
snow 9.5
adventure 9.5
land 9.2
style 8.9
wheels 8.8
soil 8.8
conveyance 8.8
grass 8.7
rust 8.7
luxury 8.6
winter 8.5
bulldozer 8.3
plow 8.3
jeep 8.3
dirty 8.1
headlight 8
bumper 8
sand 7.9
driver 7.8
military 7.7
tool 7.7
farming 7.6
fast 7.5
landscape 7.4
design 7.3
modern 7

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

land vehicle 98
vehicle 97.1
outdoor 93.4
wheel 91.9
text 89.4
car 85.3
old 84
person 83.7
clothing 67.8
tire 60.4
man 53.5
vintage 31.9

Face analysis

AWS Rekognition

Age 26-36
Gender Female, 89.6%
Happy 95.6%
Fear 2%
Sad 0.7%
Surprised 0.5%
Angry 0.3%
Confused 0.3%
Calm 0.3%
Disgusted 0.3%

AWS Rekognition

Age 34-42
Gender Female, 99.9%
Calm 70.4%
Sad 22.6%
Disgusted 2.8%
Confused 1.3%
Surprised 1.1%
Fear 0.8%
Angry 0.6%
Happy 0.4%

AWS Rekognition

Age 6-12
Gender Female, 97.2%
Angry 33%
Surprised 16.4%
Happy 16%
Disgusted 14.5%
Sad 8.7%
Calm 4.9%
Confused 4.5%
Fear 2%

AWS Rekognition

Age 2-8
Gender Female, 99.9%
Happy 95.5%
Calm 0.9%
Surprised 0.9%
Fear 0.9%
Angry 0.7%
Disgusted 0.6%
Sad 0.3%
Confused 0.3%
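Each AWS Rekognition block above pairs an estimated age range and gender with per-emotion confidences sorted from most to least likely. A hedged sketch of producing one such block from a DetectFaces FaceDetail entry (the AgeRange, Gender, and Emotions field names follow the real DetectFaces response shape; the sample values are taken from the first face block above, not recomputed):

```python
# Format one FaceDetail from AWS Rekognition DetectFaces into the
# "Age low-high / Gender / emotions" block shown above. A live call
# would be rekognition.detect_faces(Image=..., Attributes=["ALL"]).
def format_face(detail):
    lines = [
        f"Age {detail['AgeRange']['Low']}-{detail['AgeRange']['High']}",
        f"Gender {detail['Gender']['Value']}, {detail['Gender']['Confidence']:g}%",
    ]
    # Emotions are reported unordered; sort by confidence, descending.
    emotions = sorted(detail["Emotions"],
                      key=lambda e: e["Confidence"], reverse=True)
    lines += [f"{e['Type'].capitalize()} {e['Confidence']:g}%" for e in emotions]
    return lines

# Sample mirroring the first face block above.
face = {
    "AgeRange": {"Low": 26, "High": 36},
    "Gender": {"Value": "Female", "Confidence": 89.6},
    "Emotions": [
        {"Type": "HAPPY", "Confidence": 95.6},
        {"Type": "FEAR", "Confidence": 2.0},
        {"Type": "SAD", "Confidence": 0.7},
    ],
}

print("\n".join(format_face(face)))
```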

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely
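Unlike Rekognition, Google Cloud Vision's face detection reports attributes (joy, anger, sorrow, surprise, headwear, blur) as Likelihood enum buckets rather than percentages, which is why the blocks above read "Very unlikely" / "Very likely". A sketch of mapping the enum's integer values to those labels (the integer-to-bucket mapping follows the Cloud Vision Likelihood enum; the per-attribute dict shape below is an illustrative simplification, and the sample mirrors the first Google Vision block above):

```python
# Map Google Cloud Vision Likelihood enum values to display labels.
# A live call would use
# vision.ImageAnnotatorClient().face_detection(image=...).
LIKELIHOOD = {
    0: "Unknown",
    1: "Very unlikely",
    2: "Unlikely",
    3: "Possible",
    4: "Likely",
    5: "Very likely",
}

def format_face_likelihoods(face):
    """face maps attribute name -> Likelihood enum value (int)."""
    return [f"{attr} {LIKELIHOOD[value]}" for attr, value in face.items()]

# Sample mirroring the first Google Vision block above.
sample_face = {"Surprise": 1, "Anger": 1, "Sorrow": 1,
               "Joy": 5, "Headwear": 1, "Blurred": 1}

print("\n".join(format_face_likelihoods(sample_face)))
```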

Feature analysis

Amazon

Wheel 99.5%
Person 99.1%
Car 91.5%

Captions

Microsoft

a vintage photo of a truck 91.2%
a vintage photo of a man driving a car 73.9%
a vintage photo of a man riding on the back of a truck 72.6%

Text analysis

Google

ME
ME