Human Generated Data

Title

Untitled (man standing by car)

Date

c. 1945

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19137

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Vehicle 99.7
Automobile 99.7
Car 99.7
Transportation 99.7
Person 98.8
Human 98.8
Wheel 91.6
Machine 91.6
Wheel 88.9
Nature 83.3
Road 83
Asphalt 82.9
Tarmac 82.9
Weather 69.6
Outdoors 66
Spoke 63.2
Tire 62.1
Truck 60
Pickup Truck 55.7
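The scores above follow the shape of AWS Rekognition label-detection output, where each label carries a percent confidence. As a minimal sketch (the sample records below are reconstructed from a few of the scores listed above, and the `confident_labels` helper is an illustrative name, not part of any API), such tags can be filtered by a confidence threshold:

```python
# Sample records mirroring a few of the machine-generated scores above.
labels = [
    {"Name": "Car", "Confidence": 99.7},
    {"Name": "Person", "Confidence": 98.8},
    {"Name": "Wheel", "Confidence": 91.6},
    {"Name": "Pickup Truck", "Confidence": 55.7},
]

def confident_labels(records, threshold=90.0):
    """Return names of labels whose confidence meets the threshold."""
    return [r["Name"] for r in records if r["Confidence"] >= threshold]

print(confident_labels(labels))  # → ['Car', 'Person', 'Wheel']
```

Lowering the threshold admits weaker guesses such as "Pickup Truck" (55.7), which is why low-confidence tags like those below are usually treated with caution.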

Imagga
created on 2022-03-05

airport 39.1
car 38.4
vehicle 34.9
motor vehicle 34
jet 27.6
transportation 26
airfield 24.9
aircraft 24.3
craft 23.9
boat 23.8
sky 23.6
airplane 22.7
racer 22
travel 21.8
sea 21.1
water 20.7
plane 18.4
hovercraft 18.1
facility 17.2
amphibian 16.9
sunset 16.2
transport 15.5
ocean 14.9
landscape 14.9
vacation 14.7
wheeled vehicle 14.6
engine 14.4
device 14.1
airliner 13.8
speed 13.7
sun 13.7
river 13.4
air 12.9
power 12.6
flight 12.5
auto 12.4
beach 11.8
coast 11.7
airfoil 11.6
vessel 11.1
automobile 10.5
old 10.5
ship 10.3
summer 10.3
fast 10.3
lake 10.1
outdoor 9.9
sand 9.8
dusk 9.5
flying 9.5
sunrise 9.4
tourism 9.1
warplane 8.8
aviation 8.8
yacht 8.8
cruise 8.8
day 8.6
orange 8.4
fly 8.4
city 8.3
vintage 8.3
reflection 8.1
horizon 8.1
military vehicle 8
building 8
cockpit 8
wing 7.9
luxury 7.7
motion 7.7
tropical 7.7
truck 7.7
clouds 7.6
traffic 7.6
drive 7.6
leisure 7.5
outdoors 7.5
propeller 7.5
silhouette 7.5
light 7.4
trimaran 7.3
tranquil 7.2
road 7.2
holiday 7.2
activity 7.2

Microsoft
created on 2022-03-05

text 99.3
outdoor 92.9
fog 92.3
vehicle 91.9
white 80
black and white 78.6
car 75.2
land vehicle 72.8

Face analysis

Amazon

AWS Rekognition

Age 36-44
Gender Male, 99.6%
Happy 71.7%
Calm 24.9%
Surprised 1.9%
Disgusted 0.6%
Angry 0.3%
Sad 0.2%
Confused 0.2%
Fear 0.2%

Feature analysis

Amazon

Car 99.7%
Person 98.8%
Wheel 91.6%

Captions

Microsoft

a car parked on the side of a road 79.9%
a car parked on the side of the road 79.6%
a truck parked on the side of a road 79.5%

Text analysis

Amazon

S
MJI7
MJI7 YT3RAS
YT3RAS

Google

の 1日 YT33A2
1
YT33A2