Human Generated Data

Title

Untitled (man standing by car)

Date

c. 1945

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19137

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Car 99.7
Automobile 99.7
Vehicle 99.7
Transportation 99.7
Person 98.8
Human 98.8
Wheel 91.6
Machine 91.6
Wheel 88.9
Nature 83.3
Road 83
Tarmac 82.9
Asphalt 82.9
Weather 69.6
Outdoors 66
Spoke 63.2
Tire 62.1
Truck 60
Pickup Truck 55.7
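
The Amazon tags above, label names paired with confidence scores, have the shape of output returned by the AWS Rekognition DetectLabels API. A minimal sketch of such a call in Python with boto3 (the image filename and thresholds are illustrative assumptions, not taken from this record):

    import boto3

    # Sketch only: assumes the photograph is available as a local JPEG file.
    rekognition = boto3.client("rekognition")
    with open("untitled_man_standing_by_car.jpg", "rb") as f:  # hypothetical filename
        image_bytes = f.read()

    # MaxLabels and MinConfidence are assumptions chosen to roughly match the
    # list above (about 20 labels, lowest score 55.7).
    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=20,
        MinConfidence=55,
    )
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")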

Clarifai
created on 2023-10-22

monochrome 99.7
car 99.5
street 99.1
people 98.1
beach 97.5
vehicle 97.2
transportation system 96.6
city 95.7
light 95
rain 93.5
landscape 93.5
black and white 93.2
travel 91.4
fog 91.1
sea 91
water 90.2
vintage 90.1
boat 90.1
sepia 89.7
winter 89.7

Imagga
created on 2022-03-05

airport 39.1
car 38.4
vehicle 34.9
motor vehicle 34
jet 27.6
transportation 26
airfield 24.9
aircraft 24.3
craft 23.9
boat 23.8
sky 23.6
airplane 22.7
racer 22
travel 21.8
sea 21.1
water 20.7
plane 18.4
hovercraft 18.1
facility 17.2
amphibian 16.9
sunset 16.2
transport 15.5
ocean 14.9
landscape 14.9
vacation 14.7
wheeled vehicle 14.6
engine 14.4
device 14.1
airliner 13.8
speed 13.7
sun 13.7
river 13.4
air 12.9
power 12.6
flight 12.5
auto 12.4
beach 11.8
coast 11.7
airfoil 11.6
vessel 11.1
automobile 10.5
old 10.5
ship 10.3
summer 10.3
fast 10.3
lake 10.1
outdoor 9.9
sand 9.8
dusk 9.5
flying 9.5
sunrise 9.4
tourism 9.1
warplane 8.8
aviation 8.8
yacht 8.8
cruise 8.8
day 8.6
orange 8.4
fly 8.4
city 8.3
vintage 8.3
reflection 8.1
horizon 8.1
military vehicle 8
building 8
cockpit 8
wing 7.9
luxury 7.7
motion 7.7
tropical 7.7
truck 7.7
clouds 7.6
traffic 7.6
drive 7.6
leisure 7.5
outdoors 7.5
propeller 7.5
silhouette 7.5
light 7.4
trimaran 7.3
tranquil 7.2
road 7.2
holiday 7.2
activity 7.2

Microsoft
created on 2022-03-05

text 99.3
outdoor 92.9
fog 92.3
vehicle 91.9
white 80
black and white 78.6
car 75.2
land vehicle 72.8

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 36-44
Gender Male, 99.6%
Happy 71.7%
Calm 24.9%
Surprised 1.9%
Disgusted 0.6%
Angry 0.3%
Sad 0.2%
Confused 0.2%
Fear 0.2%
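
The age range, gender, and emotion scores above follow the shape of output returned by the AWS Rekognition DetectFaces API when all facial attributes are requested. A minimal sketch (the filename is an illustrative assumption):

    import boto3

    rekognition = boto3.client("rekognition")
    with open("untitled_man_standing_by_car.jpg", "rb") as f:  # hypothetical filename
        image_bytes = f.read()

    # Attributes=["ALL"] returns age range, gender, and per-emotion confidences.
    response = rekognition.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")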

Feature analysis

Amazon

Car
Person
Wheel
Car 99.7%
Person 98.8%
Wheel 91.6%
Wheel 88.9%

Categories

Imagga

cars vehicles 93.9%
interior objects 5.4%

Text analysis

Amazon

S
MJI7
MJI7 YT3RAS
YT3RAS
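
The detected strings above (license-plate-like fragments such as "YT3RAS") are consistent with output from the AWS Rekognition DetectText API. A minimal sketch (the filename is an illustrative assumption):

    import boto3

    rekognition = boto3.client("rekognition")
    with open("untitled_man_standing_by_car.jpg", "rb") as f:  # hypothetical filename
        image_bytes = f.read()

    # DetectText returns both LINE and WORD detections with confidences.
    response = rekognition.detect_text(Image={"Bytes": image_bytes})
    for detection in response["TextDetections"]:
        print(detection["Type"], detection["DetectedText"], round(detection["Confidence"], 1))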

Google

の 1日 YT33A2
1
YT33A2