Human Generated Data

Title

Untitled (young woman examining map while sitting in car next to school field)

Date

1955-1957

People

Artist: Martin Schweig, American, 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9591

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 98.3
Human 98.3
Vehicle 91
Transportation 91
Automobile 81.9
Clothing 81.9
Apparel 81.9
Car 75.7
Convertible 75.4
Driving 65
Vessel 63.7
Watercraft 63.7
Person 59.2
Boat 57.3
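The machine-generated tags above are flat "label score" lines, where the number is a confidence percentage. A minimal sketch of parsing such lines into structured records (the helper name and sample input are illustrative, taken from the Amazon block above):

```python
import re

def parse_tags(block: str) -> list[tuple[str, float]]:
    """Split each 'label score' line into a (label, confidence) pair."""
    records = []
    for line in block.strip().splitlines():
        match = re.match(r"^(.*\S)\s+(\d+(?:\.\d+)?)$", line)
        if match:
            records.append((match.group(1), float(match.group(2))))
    return records

# Sample lines copied from the Amazon tag list above.
amazon_tags = """\
Person 98.3
Human 98.3
Vehicle 91
Car 75.7
"""

for label, confidence in parse_tags(amazon_tags):
    print(f"{label}: {confidence}")
```

Note that labels can repeat with different scores (e.g. "Person" appears twice in the Amazon list), so the sketch keeps an ordered list rather than a dictionary.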

Imagga
created on 2022-01-23

convertible 100
motor vehicle 100
car 100
vehicle 44.5
transportation 38.5
wheeled vehicle 34.6
automobile 33.5
auto 33.5
drive 26.5
transport 26.5
wheel 25.7
driving 25.1
road 22.6
speed 21.1
travel 19.7
driver 16.5
seat 15.2
happy 13.8
luxury 13.7
adult 13.6
highway 13.5
people 13.4
sitting 12.9
traffic 12.3
fast 12.2
sky 12.1
motor 11.6
engine 11.6
modern 11.2
business 10.9
smile 10.7
mirror 10.5
sports 10.2
man 10.1
model 9.3
casual 9.3
power 9.2
leisure 9.1
person 8.9
boat 8.9
metal 8.8
smiling 8.7
lifestyle 8.7
attractive 8.4
summer 8.4
tourism 8.2
outdoors 8.2
reflection 8.1
male 7.8
portrait 7.8
motion 7.7
pretty 7.7
outdoor 7.6
race 7.6
moving 7.6
trip 7.5
fashion 7.5
sport 7.4
street 7.4
vacation 7.4
safety 7.4
light 7.4
device 7.2
day 7.1
sea 7

Microsoft
created on 2022-01-23

black and white 92.8
text 92.7
vehicle 84.3
monochrome 79.6

Face analysis

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.3%
Car 75.7%

Text analysis

Amazon

25184

Google

T3RA
2--XAGO
25I8 t MJI7--Y T3RA 2--XAGO
25I8
t
MJI7--Y