Human Generated Data

Title

Untitled (couple posed next to early model car by grassy hill)

Date

c. 1920-1940, printed later

People

Artist: O. B. Porter Studio, American, active 1930s-1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11176

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Machine 99.5
Wheel 99.5
Vehicle 99.5
Automobile 99.5
Antique Car 99.5
Model T 99.5
Transportation 99.5
Person 99.2
Human 99.2
Car 98.9
Person 98.2
Wheel 97.9
Hot Rod 86.6
Tire 66.1

Clarifai
created on 2019-11-16

vehicle 100
transportation system 99.7
people 99.5
car 99.1
driver 98.2
group together 97.3
two 96.1
nostalgia 95.1
man 95
adult 94.9
group 93
vintage 92.5
one 91.7
engine 91.3
four 88.6
convertible 88.2
truck 85.6
war 82.5
military 81.7
three 81.7

Imagga
created on 2019-11-16

car 100
motor vehicle 100
model t 100
vehicle 44.1
wheeled vehicle 39.6
auto 34.5
transportation 33.2
automobile 26.8
road 26.2
truck 25.2
transport 24.7
wheel 24.5
old 23
drive 22.7
rural 16.7
driving 16.4
machine 16.1
sky 15.3
speed 14.7
field 14.2
landscape 14.1
classic 13.9
antique 13.9
tractor 13.8
machinery 13.6
travel 13.4
work 13.3
vintage 13.2
retro 13.1
tire 12.9
motor 12.6
farm 12.5
agriculture 12.3
equipment 12.3
grass 11.9
engine 11.6
sport 10.7
dirt 10.5
heavy 10.5
adventure 10.4
4x4 9.9
tires 9.9
wheels 9.8
industry 9.4
cars 8.8
hay 8.8
luxury 8.6
model 8.6
outdoor 8.4
power 8.4
summer 8.4
land 8.3
outdoors 8.2
off road 7.9
traffic 7.6
farming 7.6
sports 7.4
yellow 7.3
industrial 7.3
metal 7.2
dirty 7.2
country 7

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

land vehicle 98.3
vehicle 97.4
wheel 93.7
old 90.1
car 89.9
outdoor 88.9
text 88.8
window 82.7
vintage car 59.7

Face analysis

Amazon

AWS Rekognition

Age 32-48
Gender Male, 54.9%
Angry 45%
Happy 54.9%
Disgusted 45%
Fear 45%
Confused 45%
Surprised 45%
Calm 45.1%
Sad 45%

AWS Rekognition

Age 30-46
Gender Female, 53.7%
Fear 45%
Sad 45%
Happy 53.7%
Disgusted 45%
Calm 46.2%
Confused 45%
Angry 45%
Surprised 45%

Feature analysis

Amazon

Wheel 99.5%
Person 99.2%
Car 98.9%

Captions

Microsoft

a vintage photo of a truck 85.3%
a vintage photo of a truck window 85.2%
a vintage photo of a car window 77.2%

Text analysis

Amazon

90-318

Google

90-318
90-318