Human Generated Data

Title

Untitled (two men in small vintage automobile driving on road)

Date

1946

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14297

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Automobile 99.5
Vehicle 99.5
Transportation 99.5
Car 99.5
Person 96.8
Human 96.8
Machine 93.9
Wheel 93.9
Person 93.5
Person 93.3
Road 92.1
Car 91.1
Car 88.8
Tarmac 87.8
Asphalt 87.8
Car 84.8
Car 76.4
Person 71.2
Intersection 58.8
Wheel 57.8
Pedestrian 57
Coupe 56.2
Sports Car 56.2
City 55.8
Building 55.8
Urban 55.8
Town 55.8
Person 53.3
Person 42.5
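The Amazon list above repeats labels such as Person and Car once per detected instance, each with its own confidence. As an illustration only (not part of the record), a minimal Python sketch that collapses the list to the best confidence per label and drops low-confidence hits; the data is copied verbatim from the listing above, and the 90.0 threshold is an arbitrary example value:

```python
# Rekognition-style (label, confidence) pairs, copied from the record above.
labels = [
    ("Automobile", 99.5), ("Vehicle", 99.5), ("Transportation", 99.5),
    ("Car", 99.5), ("Person", 96.8), ("Human", 96.8), ("Machine", 93.9),
    ("Wheel", 93.9), ("Person", 93.5), ("Person", 93.3), ("Road", 92.1),
    ("Car", 91.1), ("Car", 88.8), ("Tarmac", 87.8), ("Asphalt", 87.8),
    ("Car", 84.8), ("Car", 76.4), ("Person", 71.2), ("Intersection", 58.8),
    ("Wheel", 57.8), ("Pedestrian", 57.0), ("Coupe", 56.2),
    ("Sports Car", 56.2), ("City", 55.8), ("Building", 55.8),
    ("Urban", 55.8), ("Town", 55.8), ("Person", 53.3), ("Person", 42.5),
]

def dedupe(pairs, threshold=90.0):
    """Keep the highest confidence seen for each label, at or above threshold."""
    best = {}
    for name, conf in pairs:
        if conf >= threshold and conf > best.get(name, 0.0):
            best[name] = conf
    # Sort by descending confidence for readability.
    return dict(sorted(best.items(), key=lambda kv: -kv[1]))

print(dedupe(labels))
```

With the example threshold of 90.0 this yields nine unique labels, with each repeated label (Person, Car, Wheel) reduced to its single highest-confidence detection.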

Imagga
created on 2022-01-29

wheeled vehicle 31.5
shopping cart 27.9
motor vehicle 25.4
golf equipment 23.4
handcart 22.3
technology 20.8
equipment 19.9
vehicle 18.9
business 18.8
sports equipment 18
city 16.6
flight simulator 15.6
digital 15.4
architecture 14.8
3d 14.7
device 14.5
street 13.8
computer 13.7
road 13.5
container 13.4
modern 13.3
building 12.9
machine 12.7
simulator 12.5
urban 12.2
light 12
screen 11.7
conveyance 11.7
hand 11.4
high 11.3
travel 11.3
network 11.2
connection 11
sky 10.8
skyline 10.4
house 10
water 10
new 9.7
monitor 9.5
cityscape 9.5
town 9.3
global 9.1
transportation 9
tower 8.9
television 8.9
car 8.9
button 8.8
work 8.6
steamroller 8.6
structure 8.4
communication 8.4
future 8.4
transport 8.2
information 8
sea 7.8
corporate 7.7
center 7.7
construction 7.7
perspective 7.5
three dimensional 7.5
ocean 7.5
close 7.4
room 7.4
symbol 7.4
speed 7.3
data 7.3
home 7.2
night 7.1

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

text 99.8
road 95.6
vehicle 92.8
ship 79.3
car 78.1
black and white 71.5
black 70.3
land vehicle 65.7
old 41.2
vintage 32.1
image 30.8

Face analysis

Amazon

AWS Rekognition

Age 27-37
Gender Male, 96.4%
Calm 98.8%
Sad 0.3%
Disgusted 0.3%
Surprised 0.2%
Confused 0.2%
Fear 0.1%
Angry 0.1%
Happy 0.1%

AWS Rekognition

Age 28-38
Gender Female, 94.2%
Fear 45.4%
Sad 18.5%
Confused 13.9%
Surprised 6.8%
Disgusted 4.9%
Happy 4.4%
Calm 4.2%
Angry 1.9%

Feature analysis

Amazon

Car 99.5%
Person 96.8%
Wheel 93.9%

Captions

Microsoft

a vintage photo of a person 85.6%
a vintage photo of a group of people in front of a building 81.7%
a vintage photo of a group of people standing in front of a building 77.7%

Text analysis

Amazon

GAS
66
GAS STATION
STATION
Phillips
ONE
ONE STOP
STOP
PHILLIPS
S
Badweiser
PROFESSION
FREE
STOP PROFESSION HAUMACIS
WZON
Maceu
M_IIF
M_IIF ACHAA
ACHAA
TEAKU
HAUMACIS
LOURITY

Google

YT3RA
MJIR
MJIR YT3RA