Human Generated Data

Title

Untitled (family with farm equipment; tractors, hay pile)

Date

c. 1950

People

Artist: Mary Lowber Tiers, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15863

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Human 99.6
Person 99.6
Person 99.2
Machine 97
Wheel 97
Person 95.5
Text 94.8
Person 92
Person 85.8
Clothing 84.3
Apparel 84.3
Person 77.8
Meal 69.9
Food 69.9
Bus Stop 65.8
Wheel 65.7
Vehicle 59.1
Car 59.1
Transportation 59.1
Automobile 59.1
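
The Amazon tags above are object/label detections, each paired with a confidence score. As a point of reference only, here is a minimal sketch of how such label-and-score pairs could be produced with AWS Rekognition's detect_labels call via boto3; the client setup, file name, and confidence cutoff are illustrative assumptions, not part of this record.

    # Sketch: reproduce label + confidence pairs like the listing above.
    import boto3

    rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

    with open("photograph.jpg", "rb") as f:  # hypothetical local scan of the negative
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=50,  # assumed cutoff; borderline labels such as "Car 59.1" still pass
    )

    # Print each label with its confidence, mirroring the rows above.
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')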

Imagga
created on 2022-02-05

newspaper 38
grunge 31.5
negative 30.9
film 29.9
vintage 29.8
old 29.2
product 29.1
monitor 28.1
creation 23.7
retro 22.9
black 21.6
texture 21.5
grungy 19.9
antique 19.9
paint 19
screen 19
art 19
frame 18.3
pattern 17.1
rough 16.4
border 16.3
dirty 16.3
ancient 15.6
aged 15.4
graphic 15.3
space 14.7
design 14.6
photographic paper 14.2
computer 14.1
text 14
material 13.4
decoration 13.1
building 12.9
collage 12.5
rust 12.5
damaged 12.4
electronic equipment 12
photographic 11.8
noise 11.7
messy 11.6
man 11.4
digital 11.3
architecture 10.9
noisy 10.8
layered 10.8
slide 10.7
light 10.7
movie 10.7
structure 10.6
edge 10.6
mask 10.5
paper 10.2
equipment 10.1
city 10
designed 9.8
mess 9.8
frames 9.8
scratch 9.8
strip 9.7
layer 9.7
television 9.6
urban 9.6
photographic equipment 9.6
weathered 9.5
window 9.3
office 9.2
business 9.1
silhouette 9.1
people 8.9
brown 8.8
your 8.7
water 8.7
dirt 8.6
wall 8.5
color 8.3
grain 8.3
letter 8.2
overlay 7.9
circa 7.9
textured 7.9
scratches 7.9
burned 7.8
artistic 7.8
wallpaper 7.7
dark 7.5
landscape 7.4
camera 7.4
copy space 7.2
male 7.1
working 7.1
businessman 7.1

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

text 99.6
black and white 84.9
person 72.9
house 62.5
old 57.7
tree 56.9

Face analysis

AWS Rekognition

Age 23-33
Gender Male, 95.9%
Sad 35.4%
Fear 26.3%
Calm 9.8%
Happy 7.4%
Disgusted 7.3%
Confused 6.9%
Angry 4.5%
Surprised 2.3%

AWS Rekognition

Age 23-31
Gender Male, 59.5%
Happy 69.4%
Calm 12.8%
Sad 9.7%
Angry 4.5%
Fear 1.3%
Confused 1.1%
Disgusted 0.8%
Surprised 0.5%
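
The two AWS Rekognition blocks above report, per detected face, an estimated age range, a gender guess with its confidence, and a ranked list of emotion scores. A minimal sketch of how this per-face output could be obtained with Rekognition's detect_faces call via boto3 follows; the file name and client configuration are assumptions.

    # Sketch: per-face age range, gender, and emotion scores as listed above.
    import boto3

    rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

    with open("photograph.jpg", "rb") as f:  # hypothetical image file
        image_bytes = f.read()

    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # request age, gender, and emotion estimates
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        # Emotions come back unsorted; sort by confidence to match the listing above.
        for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')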

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Feature analysis

Amazon

Person 99.6%
Wheel 97%
Car 59.1%

Captions

Microsoft

an old photo of a group of people in a room 69.3%
old photo of a group of people in a room 65.8%
an old photo of a person 54.6%

Text analysis

Amazon

KODAK
SAFETY
KODAK SAFETY FILM
12
FILM
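
The strings above correspond to the film-edge markings on the negative ("KODAK SAFETY FILM", frame number "12"). A minimal sketch of retrieving such text with AWS Rekognition's detect_text call via boto3, under the same assumptions about credentials and a local image file:

    # Sketch: detect printed text in the scanned negative.
    import boto3

    rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

    with open("photograph.jpg", "rb") as f:  # hypothetical image file
        image_bytes = f.read()

    response = rekognition.detect_text(Image={"Bytes": image_bytes})

    # LINE detections give whole phrases; WORD detections give the individual tokens.
    for detection in response["TextDetections"]:
        print(detection["Type"], detection["DetectedText"])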

Google

KODAK
12
FILM
S'AFETY
12 KODAK S'AFETY FILM