Human Generated Data

Title

Untitled (couple sitting in car at drive-in)

Date

1956

People

Artist: Samuel Cooper, American, active 1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19603

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Bumper 99.3
Vehicle 99.3
Transportation 99.3
Sedan 98.4
Automobile 98.4
Sunglasses 97.1
Accessories 97.1
Accessory 97.1
Person 97
Human 97
Clothing 97
Apparel 97
Face 96.6
Car 93.9
Car 89.7
Car 86.4
Car 85.3
Car 82.9
Sports Car 79.7
Portrait 77.7
Photography 77.7
Photo 77.7
Helmet 72.7
Car 71.9
Female 64
Suit 63.6
Overcoat 63.6
Coat 63.6
Man 60.1
Coupe 59.8
Car Dealership 56.8
Glasses 56.7
Selfie 55.7

Clarifai
created on 2023-10-22

car 100
vehicle 99.9
people 99.8
transportation system 99.7
adult 99
convertible 98.5
driver 98.1
two 97.1
man 96.7
woman 94.9
facial expression 94.6
three 92
one 91.2
hood 90.5
monochrome 89.9
vehicle window 88.6
windshield 87.6
sitting 87.4
aircraft 86.8
group together 84.6

Imagga
created on 2022-03-05

car 100
convertible 67.3
vehicle 54.7
motor vehicle 52.9
automobile 45
transportation 44.8
auto 38.3
driver 33
drive 32.2
transport 31.1
driving 30.9
road 28
sitting 24.1
wheel 23.7
man 22.8
person 21.3
speed 20.2
seat 19.9
male 17.7
wheeled vehicle 16.8
motor 16.5
luxury 16.3
aviator 16.3
people 16.2
happy 15.7
adult 15.5
device 15.5
traffic 14.3
outdoors 14.2
travel 14.1
inside 13.8
fast 13.1
engine 12.5
speedboat 12.5
new 12.1
boat 11.8
portrait 11.7
business 11.5
street 11
motorboat 11
windshield 11
smiling 10.9
smile 10.7
looking 10.4
happiness 10.2
bumper 10.1
support 9.8
black 9.6
motion 9.4
sports 9.2
hand 9.1
cockpit 9.1
highway 8.7
outside 8.6
modern 8.4
mature 8.4
holding 8.3
occupation 8.2
headlight 8.1
mirror 8.1
light 8
job 8
hood 7.9
cars 7.8
middle aged 7.8
elegant 7.7
attractive 7.7
professional 7.6
sport 7.4
machine 7.4
hat 7.3
vessel 7.2
model 7

Microsoft
created on 2022-03-05

text 98.4
vehicle 95.1
car 94.5
land vehicle 90.8
black and white 75.6
automotive 68.3
wheel 60.2
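
The tag lists above all share one shape: a label followed by a confidence score on a 0–100 scale. As a minimal sketch of how such lists can be parsed and thresholded — the helper names are hypothetical, and the sample values are copied from the Microsoft section of this record:

```python
# Sketch: parse "label score" lines like the tag lists above and keep
# only high-confidence labels. Input strings are taken verbatim from
# the Microsoft section of this record; function names are assumptions.

def parse_tags(lines):
    """Split each 'label score' line into a (label, float score) pair."""
    tags = []
    for line in lines:
        # rpartition keeps multi-word labels like "land vehicle" intact
        label, _, score = line.rpartition(" ")
        tags.append((label, float(score)))
    return tags

def high_confidence(tags, threshold=90.0):
    """Keep labels at or above the threshold, highest score first."""
    return sorted(
        [(label, score) for label, score in tags if score >= threshold],
        key=lambda pair: pair[1],
        reverse=True,
    )

microsoft = parse_tags([
    "text 98.4", "vehicle 95.1", "car 94.5",
    "land vehicle 90.8", "black and white 75.6",
    "automotive 68.3", "wheel 60.2",
])
print(high_confidence(microsoft))
```

With the default 90.0 threshold this keeps the four strongest Microsoft labels (text, vehicle, car, land vehicle) and drops the rest.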

Face analysis

AWS Rekognition

Age 45-53
Gender Female, 87.4%
Happy 94.6%
Calm 4.4%
Surprised 0.4%
Confused 0.2%
Angry 0.1%
Disgusted 0.1%
Sad 0.1%
Fear 0%

AWS Rekognition

Age 25-35
Gender Male, 93.7%
Happy 87.9%
Calm 7.8%
Surprised 3.1%
Disgusted 0.4%
Sad 0.2%
Fear 0.2%
Confused 0.2%
Angry 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Sunglasses 97.1%
Person 97%
Car 93.9%
Car 89.7%
Car 86.4%
Car 85.3%
Car 82.9%
Car 71.9%
Helmet 72.7%

Text analysis

Amazon

8.5
G