Human Generated Data

Title

Untitled (two young women posed at car with one on top)

Date

1955-1957

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9588

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Automobile 99.6
Car 99.6
Vehicle 99.6
Transportation 99.6
Person 98.6
Human 98.6
Antique Car 91.5
Sports Car 91.1
Tire 90.5
Wheel 89.2
Machine 89.2
Hot Rod 87.8
Car Wheel 79.4
Spoke 79
Alloy Wheel 76.9
Convertible 74.9
Coupe 74.6
Person 73
Race Car 55.3
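
The Amazon tags above are label-detection confidence scores (0-100). A minimal sketch of how such labels could be requested from AWS Rekognition via boto3 is shown below; the file name photo.jpg and the MaxLabels/MinConfidence values are illustrative assumptions, not taken from this record.

```python
import boto3

# Assumption: AWS credentials are configured in the environment and
# "photo.jpg" is a local copy of the image being analyzed.
client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=25,        # illustrative limit
    MinConfidence=55.0,  # roughly matches the lowest score shown above
)

# Print each label with its confidence, similar to the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```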

Imagga
created on 2022-01-23

car 100
convertible 100
motor vehicle 97.2
vehicle 65
auto 57.4
automobile 54.6
speed 51.3
transportation 51.1
wheel 45.8
fast 39.3
transport 38.4
drive 35.9
luxury 35.2
wheeled vehicle 34
motor 32.9
road 32.5
sport 27.2
sports 26.8
power 26
engine 26
driving 25.1
headlight 24
style 23
race 22
design 22
expensive 21.1
tire 20.9
modern 19.6
shiny 19
chrome 18.8
model 17.9
automotive 17.7
travel 16.9
traffic 16.2
roadster 15.8
light 15.4
technology 14.8
street 14.7
reflection 14.6
elegant 14.6
bumper 14.2
sky 14
metallic 13.8
metal 13.7
wheels 13.7
performance 13.4
black 13.2
windshield 12.8
racing 12.7
old 11.8
vintage 11.6
machine 11.5
amphibian 11.1
sedan 10.8
retro 10.7
new 10.5
lamp 10.5
show 10.4
classic 10.2
tires 9.9
detail 9.7
highway 9.6
color 8.9
hood 8.8
cars 8.8
bright 8.6
seat 8.5
clouds 8.5
silver 8
lifestyle 8
urban 7.9
mirror 7.6
dream 7.6
screen 7.5
sports car 7.2
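
The Imagga scores above come from an automated tagging service. A rough sketch of querying Imagga's v2 tagging endpoint over REST is shown below; the endpoint path, authentication scheme, and response layout are stated from general familiarity with the service and should be checked against Imagga's current documentation, and API_KEY, API_SECRET, and IMAGE_URL are placeholders.

```python
import requests

# Placeholders; not part of this record.
API_KEY = "your_api_key"
API_SECRET = "your_api_secret"
IMAGE_URL = "https://example.com/photo.jpg"

# Imagga's v2 tags endpoint accepts an image URL and HTTP Basic auth.
response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
    timeout=30,
)
response.raise_for_status()

# Each tag carries a 0-100 confidence and a language-keyed label.
for item in response.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```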

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 98.9
car 95.8
black and white 90.4
outdoor 87.4
vehicle 79.6
black 67.1
land vehicle 53.1
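
The Microsoft tags above resemble output from the Azure Computer Vision tagging operation. A minimal sketch using the azure-cognitiveservices-vision-computervision SDK is below; the endpoint, key, and file name are placeholders, and the exact service version used for this record is not known.

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholders; not part of this record.
ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"
KEY = "your_key"

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

# Tag a local image and print name/confidence pairs (confidence is 0-1).
with open("photo.jpg", "rb") as f:
    result = client.tag_image_in_stream(f)

for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```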

Face analysis

AWS Rekognition

Age 20-28
Gender Female, 74.4%
Calm 53.8%
Sad 27.7%
Happy 15.5%
Angry 1.1%
Disgusted 0.8%
Confused 0.5%
Surprised 0.3%
Fear 0.2%

AWS Rekognition

Age 34-42
Gender Female, 95.2%
Calm 95.8%
Sad 2.2%
Happy 1.2%
Confused 0.3%
Surprised 0.2%
Disgusted 0.1%
Angry 0.1%
Fear 0.1%
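
The two face records above match the shape of AWS Rekognition's DetectFaces output (an age range, a gender estimate with confidence, and a ranked emotion list). A minimal sketch is below, assuming configured AWS credentials and a local copy of the image.

```python
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

# Attributes=["ALL"] requests age range, gender, and emotions.
response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions come back unordered; sort by confidence as in the listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```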

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
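
The Google Vision face results above are likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than numeric scores. A minimal sketch using the google-cloud-vision client is below, assuming application default credentials and a local image file.

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each detected face reports likelihood buckets for several attributes.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```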

Feature analysis

Amazon

Car 99.6%
Person 98.6%
Wheel 89.2%

Captions

Microsoft

a vintage photo of a person driving a car 85.3%
a vintage photo of a car 85.2%
a vintage photo of a person in a car 85.1%
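
The captions above look like output from Azure Computer Vision's describe operation, which returns a few candidate captions with confidences. A minimal sketch is below, reusing the placeholder endpoint, key, and file name from the tagging sketch earlier.

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
KEY = "your_key"                                                   # placeholder

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

# Ask for up to three candidate captions, as in the listing above.
with open("photo.jpg", "rb") as f:
    description = client.describe_image_in_stream(f, max_candidates=3)

for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```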

Text analysis

Amazon

25184

Google

MJI7--YT 3RA°2 - - NAGOX
MJI7--YT
-
NAGOX
3RA°2
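
The strings above are machine-read text detected in the photograph. A minimal sketch of both text-detection calls is below, under the same placeholder-file and credential assumptions as the earlier sketches.

```python
import boto3
from google.cloud import vision

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

# Amazon Rekognition: print line-level text detections.
rekognition = boto3.client("rekognition")
for detection in rekognition.detect_text(Image={"Bytes": image_bytes})["TextDetections"]:
    if detection["Type"] == "LINE":
        print("Amazon:", detection["DetectedText"])

# Google Cloud Vision: the first annotation is the full detected block,
# followed by the individual words.
vision_client = vision.ImageAnnotatorClient()
response = vision_client.text_detection(image=vision.Image(content=image_bytes))
for annotation in response.text_annotations:
    print("Google:", annotation.description)
```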