Human Generated Data

Title

Untitled (two men looking into shop window at mysterious vacuum display)

Date

1948

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14613

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Human 99.3
Person 99.3
Car 99.2
Automobile 99.2
Vehicle 99.2
Transportation 99.2
Person 99.2
Apparel 98.3
Clothing 98.3
Person 98
Wheel 90
Machine 90
Appliance 78.4
Shorts 75
Food 69.2
Dish 69.2
Meal 69.2
Pants 66.1
Female 64.5
Poster 61.6
Advertisement 61.6
Overcoat 60.6
Coat 60.6
Door 58.4
Suit 58.2

Imagga
created on 2022-01-29

device 33.4
iron lung 24.8
equipment 24.2
machine 23.3
room 21.1
home 19.9
respirator 19.8
man 18.8
interior 18.6
person 18.5
medical 17.6
breathing device 17
kitchen 16.7
work 16.5
male 16.3
hospital 16.1
technology 15.6
people 15
professional 14.4
clean 14.2
patient 13.6
house 13.4
working 13.2
occupation 12.8
doctor 12.2
job 11.5
medicine 11.4
radio 11.3
health 11.1
indoors 10.5
modern 10.5
toilet 10.4
holding 9.9
worker 9.8
clinic 9.7
bathroom 9.6
men 9.4
industry 9.4
adult 9.1
industrial 9.1
appliance 9
instrument 8.9
steel 8.8
washing 8.7
furniture 8.7
lifestyle 8.7
uniform 8.5
surgeon 8.5
business 8.5
nurse 8.4
floor 8.4
connection 8.2
slicer 8.2
metal 8
sterile 7.9
surgery 7.8
electronic equipment 7.7
wash 7.7
old 7.7
hand 7.6
office 7.4
apparatus 7.3
computer 7.2
sink 7.1

Microsoft
created on 2022-01-29

text 99.5
wheel 68.6
black and white 58.3

Face analysis

AWS Rekognition

Age 47-53
Gender Male, 100%
Calm 97.3%
Angry 1.1%
Surprised 0.6%
Sad 0.6%
Disgusted 0.2%
Confused 0.1%
Happy 0%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%
Car 99.2%
Wheel 90%

Captions

Microsoft

an old photo of a person 62.2%
a person standing in front of a building 45%
old photo of a person 44.9%

Text analysis

Amazon

FILTER
FILTER QUEEN
QUEEN
REMINGTON
MJI7
MJI7 YE33A ECHA
YE33A
ECHA

Google

MJI7 YT3RA A REMINGTON
A
REMINGTON
MJI7
YT3RA