Human Generated Data

Title

Untitled (men driving old car down city street)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15359

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Automobile 99.8
Vehicle 99.8
Transportation 99.8
Car 99.8
Person 98.7
Human 98.7
Person 98.7
Car 95.6
Tarmac 93.3
Asphalt 93.3
Person 89
Person 88.8
Road 88
Wheel 81
Machine 81
Person 74.3
Sports Car 73
Helmet 61.7
Clothing 61.7
Apparel 61.7
Antique Car 59.4
Model T 56.8
Hot Rod 55.3
Wheel 53.7
Person 43.7
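The Amazon list above repeats some labels (e.g. "Person", "Car", "Wheel") at different confidences, since the detector reports one score per detected instance. A minimal sketch, using the pairs transcribed from the list above, of collapsing each label to its highest-confidence instance and filtering by a threshold (the function name and threshold are illustrative, not part of the record):

```python
# (label, confidence) pairs transcribed from the Amazon tag list above
labels = [
    ("Automobile", 99.8), ("Vehicle", 99.8), ("Transportation", 99.8),
    ("Car", 99.8), ("Person", 98.7), ("Human", 98.7), ("Person", 98.7),
    ("Car", 95.6), ("Tarmac", 93.3), ("Asphalt", 93.3), ("Person", 89.0),
    ("Person", 88.8), ("Road", 88.0), ("Wheel", 81.0), ("Machine", 81.0),
    ("Person", 74.3), ("Sports Car", 73.0), ("Helmet", 61.7),
    ("Clothing", 61.7), ("Apparel", 61.7), ("Antique Car", 59.4),
    ("Model T", 56.8), ("Hot Rod", 55.3), ("Wheel", 53.7), ("Person", 43.7),
]

def top_labels(pairs, min_confidence=90.0):
    """Keep each label once, at its highest confidence, above a threshold."""
    best = {}
    for name, score in pairs:
        if score > best.get(name, 0.0):
            best[name] = score
    return {name: score for name, score in best.items() if score >= min_confidence}

print(top_labels(labels))
```

With the default 90.0 threshold this keeps only the strongest detections (Automobile, Vehicle, Transportation, Car, Person, Human, Tarmac, Asphalt), each listed once.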

Imagga
created on 2022-03-05

sports equipment 100
golf equipment 100
motor vehicle 100
wheeled vehicle 76.3
equipment 71.4
vehicle 54.3
transportation 27.8
car 27.2
road 27.1
transport 22.8
travel 21.1
street 18.4
city 17.4
highway 17.4
traffic 17.1
speed 15.6
drive 15.1
urban 14.8
sky 14.7
automobile 14.4
auto 13.4
business 11.5
landscape 11.2
technology 11.1
motion 11.1
sea 10.9
ocean 10.8
driving 10.6
scene 10.4
architecture 10.1
vacation 9.8
new 9.7
truck 9.7
water 9.3
outdoor 9.2
island 9.1
people 8.9
light 8.7
engine 8.7
industry 8.5
tree 8.5
land 8.3
tourist 8.2
coast 8.1
activity 8.1
asphalt 7.8
marina 7.8
moving 7.6
wheel 7.5
trip 7.5
fast 7.5
boat 7.4
blur 7.4
tourism 7.4
sports 7.4
building 7.1
trees 7.1
summer 7.1
modern 7

Microsoft
created on 2022-03-05

text 99.5
vehicle 94.3
land vehicle 88.1
black and white 72.1
white 67.2
black 66.5
wheel 65.9
old 52.4
vintage 31.3

Face analysis

Amazon

AWS Rekognition

Age 23-31
Gender Female, 99.6%
Happy 70.2%
Fear 8.9%
Calm 5.5%
Angry 5.2%
Sad 3.3%
Surprised 2.9%
Disgusted 2.4%
Confused 1.6%

AWS Rekognition

Age 27-37
Gender Male, 99.9%
Happy 57.6%
Calm 20.5%
Fear 5.3%
Disgusted 5.1%
Sad 4.1%
Confused 4.1%
Surprised 1.7%
Angry 1.6%

AWS Rekognition

Age 24-34
Gender Male, 85.8%
Happy 35.7%
Calm 29.8%
Sad 9.7%
Confused 9%
Disgusted 5%
Fear 4.9%
Angry 4.8%
Surprised 1.2%

AWS Rekognition

Age 27-37
Gender Male, 89%
Sad 55.2%
Calm 22.9%
Confused 10.1%
Fear 2.8%
Disgusted 2.7%
Surprised 2.5%
Happy 2.2%
Angry 1.6%
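Each AWS Rekognition face record above distributes confidence across eight emotions. A minimal sketch, with the scores transcribed from the four records above, of reducing each face to its single dominant emotion:

```python
# Emotion scores for the four detected faces, transcribed from the
# AWS Rekognition records above (one dict per face)
faces = [
    {"Happy": 70.2, "Fear": 8.9, "Calm": 5.5, "Angry": 5.2, "Sad": 3.3,
     "Surprised": 2.9, "Disgusted": 2.4, "Confused": 1.6},
    {"Happy": 57.6, "Calm": 20.5, "Fear": 5.3, "Disgusted": 5.1, "Sad": 4.1,
     "Confused": 4.1, "Surprised": 1.7, "Angry": 1.6},
    {"Happy": 35.7, "Calm": 29.8, "Sad": 9.7, "Confused": 9.0,
     "Disgusted": 5.0, "Fear": 4.9, "Angry": 4.8, "Surprised": 1.2},
    {"Sad": 55.2, "Calm": 22.9, "Confused": 10.1, "Fear": 2.8,
     "Disgusted": 2.7, "Surprised": 2.5, "Happy": 2.2, "Angry": 1.6},
]

def dominant_emotion(scores):
    """Return the emotion with the highest confidence for one face."""
    return max(scores, key=scores.get)

print([dominant_emotion(f) for f in faces])  # three Happy faces, one Sad
```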

Feature analysis

Amazon

Car 99.8%
Person 98.7%
Wheel 81%
Helmet 61.7%

Captions

Microsoft

a vintage photo of a store 74.9%
a vintage photo of a building 74.8%
a vintage photo of a group of people standing in front of a building 61.6%

Text analysis

Amazon

SALES
SERVICE
AKER
5
M 117
M 117 YE3AD ACHA
ACHA
YE3AD

Google

MJ7
YT3RA
A
MJ7 YT3RA A
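The Amazon text results above include both full lines and the individual words within them (e.g. "M 117 YE3AD ACHA" alongside "M 117", "YE3AD", and "ACHA"), which is how text detectors commonly report results at two granularities. A minimal sketch (assuming Rekognition-style records carrying a Type of LINE or WORD; the sample records are illustrative) of keeping only the line-level detections:

```python
# Illustrative detection records modeled on the Amazon text output above;
# each record carries the detected string and its granularity
detections = [
    {"DetectedText": "SALES", "Type": "LINE"},
    {"DetectedText": "SERVICE", "Type": "LINE"},
    {"DetectedText": "M 117 YE3AD ACHA", "Type": "LINE"},
    {"DetectedText": "M 117", "Type": "WORD"},
    {"DetectedText": "YE3AD", "Type": "WORD"},
    {"DetectedText": "ACHA", "Type": "WORD"},
]

# Drop the word-level records so each string is reported once, as a line
lines = [d["DetectedText"] for d in detections if d["Type"] == "LINE"]
print(lines)
```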