Human Generated Data

Title

Untitled (Truck, Embarcadero, San Francisco)

Date

August 19, 1949

People

Artist: Minor White, American, 1908-1976

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Anonymous Loan, 3.1994.164

Copyright

© The Trustees of Princeton University

Machine Generated Data

Tags

Amazon
created on 2022-01-30

Human 96.6
Person 96.6
Automobile 81.5
Vehicle 81.5
Transportation 81.5
Car 81.5
Advertisement 78.8
Overcoat 78.1
Apparel 78.1
Coat 78.1
Clothing 78.1
Poster 77.6
Suit 77.2
Wheel 67.5
Machine 67.5
Tire 63
Airplane 61.9
Aircraft 61.9
Flyer 59
Brochure 59
Paper 59
Spoke 56
Text 55.2

Imagga
created on 2022-01-30

device 38.2
equipment 37.4
home appliance 34.9
sewing machine 34.5
appliance 32.6
machine 29.9
textile machine 27.6
technology 27.4
box 22.8
safe 21.3
telephone 18.8
durables 16.9
metal 15.3
container 15.1
computer 14.4
locker 13.9
industry 13.7
work 13.3
strongbox 12.8
home 12.8
electronic equipment 12.3
phone 12
call 11.8
fastener 11.8
communication 11.7
people 11.7
digital 11.3
blister pack 10.9
washing 10.7
silver 10.6
indoors 10.5
person 10.1
power 10.1
clean 10
steel 9.7
cable 9.5
object 9.5
electronic 9.3
washer 9.3
inside 9.2
restraint 9.1
business 9.1
holding 9.1
adult 9
office 8.8
household 8.7
packaging 8.6
tools 8.5
3d 8.5
old 8.4
occupation 8.2
data 8.2
open 8.1
close 8
job 8
interior 8
render 7.8
dial 7.7
electrical 7.7
communications 7.7
three dimensional 7.5
one 7.5
man 7.4
room 7.4
safety 7.4
light 7.3
security 7.3
industrial 7.3
domestic 7.2
worker 7.1

Google
created on 2022-01-30

Microsoft
created on 2022-01-30

text 99.3
vehicle 80.5
land vehicle 76.8

Face analysis

Amazon

Google

AWS Rekognition

Age 45-53
Gender Male, 99.2%
Calm 99.8%
Surprised 0.1%
Sad 0%
Disgusted 0%
Confused 0%
Happy 0%
Angry 0%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 96.6%
Car 81.5%
Wheel 67.5%
Airplane 61.9%

Captions

Microsoft

an old photo of a person 45.4%
an old photo of a person 43.8%
old photo of a person 43.5%

Text analysis

Amazon

and
take
they
THE
Man
192
MAN
his
makes
IS
of
ONLY
in
he
own
NOT
MAN HIMSELF IS NOT THE ONLY OBJECT.
on
some
OBJECT.
them
Не makes things, they take on some of his characteristics.
machi
Не
HIMSELF
image.
things,
often he makes them in his own image.
N
Man and his machi ines
characteristics.
often
ines

Google

MAN
NOT
ONLY
makes
thinge,
they
in
own
THE
some
his
he
image.
take
cherecterietice.
Man
machi
OBJECT.
on
of
and
nes
192 MAN HIMSELF IS NOT THE ONLY OBJECT. He makes thinge, they take on some of his cherecterietice. Often he makes them in his own image. Man and hie machi nes FFER
192
HIMSELF
IS
He
Often
them
hie
FFER