Human Generated Data

Title

Untitled (men looking at vacuum demonstration in shop window)

Date

1948

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14604

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Person 98.9
Human 98.9
Person 97.9
Clothing 97
Apparel 97
Tie 95.4
Accessories 95.4
Accessory 95.4
Door 75
Bicycle 67.5
Transportation 67.5
Vehicle 67.5
Bike 67.5
Floor 66.3
Appliance 59.6
Overcoat 59.3
Coat 59.3
Suit 57.4
Sun Hat 57.1
Hat 57.1
Cleaning 55.5
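
These labels can be reproduced with the AWS Rekognition label-detection API. A minimal Python sketch follows; the filename, region, and MinConfidence threshold are assumptions (the list above happens to bottom out near 55%), not recorded parameters of this run.

    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")  # assumed region

    with open("gould_vacuum_demo.jpg", "rb") as f:  # hypothetical local copy of the photograph
        image_bytes = f.read()

    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=55,  # assumed cutoff; the lowest tag above is 55.5
    )
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")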

Clarifai
created on 2023-10-27

people 99.5
monochrome 98.5
lid 98
vehicle 95.9
adult 95.4
man 94.4
veil 92
administration 90.5
transportation system 90.1
outfit 89.9
two 88.1
wear 84.1
nostalgia 83.8
three 83
one 82.7
watercraft 81.3
uniform 76.1
military 74.4
offense 73.1
group together 72
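
Clarifai concepts such as these are typically retrieved through its v2 predict endpoint. A hedged sketch with the requests library, assuming the public general-image-recognition model and placeholder credentials and URL; concept values come back on a 0-1 scale and are scaled here to match the list above.

    import requests

    resp = requests.post(
        "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
        headers={"Authorization": "Key YOUR_CLARIFAI_API_KEY"},  # placeholder key
        json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
    )
    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        print(f"{concept['name']} {concept['value'] * 100:.1f}")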

Imagga
created on 2022-01-29

cleaner 32.9
device 25.9
man 17.5
people 16.2
interior 15.9
home 15.1
male 14.9
equipment 14.5
person 14.4
clean 14.2
stethoscope 13.5
medical 13.2
building 12.5
working 12.4
room 12.1
instrument 12
men 12
health 11.8
house 11.7
adult 11.1
work 10.2
hand 9.9
retro 9.8
modern 9.8
human 9.7
medicine 9.7
chemical 9.6
glass 9.6
floor 9.3
occupation 9.2
window 9.1
fashion 9
one 8.9
bathroom 8.9
indoors 8.8
urban 8.7
chemistry 8.7
laboratory 8.7
water 8.7
research 8.6
old 8.4
fun 8.2
technology 8.2
worker 8.1
medical instrument 8
lifestyle 7.9
face 7.8
high 7.8
lab 7.8
play 7.7
professional 7.7
wall 7.7
active 7.7
repair 7.7
energy 7.6
player 7.5
chair 7.5
treatment 7.3
music 7.3
machine 7.2
vacuum 7.2
science 7.1
architecture 7
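
Imagga exposes its tagger as a REST endpoint authenticated with HTTP Basic credentials. A sketch with placeholder key, secret, and image URL:

    import requests

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.com/photo.jpg"},  # placeholder URL
        auth=("YOUR_API_KEY", "YOUR_API_SECRET"),  # placeholder credentials
    )
    for tag in resp.json()["result"]["tags"]:
        print(f"{tag['tag']['en']} {tag['confidence']:.1f}")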

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

text 98.2
musical instrument 94.5
clothing 93.1
drum 85.4
person 85
black and white 73.3
man 71.8
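
Microsoft's tags come from the Azure Computer Vision service; one way to obtain output in this shape is the v3.2 REST tag operation. The endpoint, key, and image URL below are placeholders, and the API version actually used for this record is not documented.

    import requests

    endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
    resp = requests.post(
        f"{endpoint}/vision/v3.2/tag",
        headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},  # placeholder
        json={"url": "https://example.com/photo.jpg"},
    )
    for tag in resp.json()["tags"]:
        print(f"{tag['name']} {tag['confidence'] * 100:.1f}")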

Face analysis

AWS Rekognition

Age 35-43
Gender Male, 98%
Surprised 32.9%
Sad 20.5%
Happy 13.4%
Fear 12.3%
Angry 7.4%
Disgusted 5.5%
Calm 5.3%
Confused 2.7%

AWS Rekognition

Age 24-34
Gender Male, 93.8%
Calm 86.3%
Confused 7.5%
Happy 2.8%
Surprised 1.6%
Angry 0.9%
Sad 0.4%
Disgusted 0.3%
Fear 0.2%
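
Both estimates above have the shape of AWS Rekognition's detect_faces output with full attributes: an age range, a gender guess with confidence, and a per-face emotion distribution. A sketch, with the filename and region assumed:

    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")  # assumed region
    with open("gould_vacuum_demo.jpg", "rb") as f:  # hypothetical local file
        response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")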

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
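
Google Vision reports face attributes as likelihood buckets rather than percentages, which matches the three blocks above (one per detected face). A sketch using the google-cloud-vision client; the file path is a placeholder and application credentials are assumed to be configured.

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("gould_vacuum_demo.jpg", "rb") as f:  # hypothetical local file
        image = vision.Image(content=f.read())

    for face in client.face_detection(image=image).face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)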

Feature analysis

Amazon

Person 98.9%
Person 97.9%
Tie 95.4%
Bicycle 67.5%

Categories

Imagga

paintings art 98.9%

Text analysis

Amazon

QUEEN
FILTER QUEEN
FILTER
REMINGTON
YT3RAS
MJIR
MJIR YT3RAS ACHA
LOUIS
ACHA
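
AWS Rekognition's detect_text returns both LINE and WORD detections, which is why "FILTER QUEEN" appears above alongside the separate words "FILTER" and "QUEEN". A sketch, with the filename and region assumed:

    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")  # assumed region
    with open("gould_vacuum_demo.jpg", "rb") as f:  # hypothetical local file
        response = client.detect_text(Image={"Bytes": f.read()})

    for det in response["TextDetections"]:
        print(det["Type"], det["DetectedText"], f"{det['Confidence']:.1f}%")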

Google

MJI7 YT3RA 0A REMINGTON SE LOUI QUEEN FILTER
MJI7
YT3RA
0A
REMINGTON
SE
LOUI
QUEEN
FILTER
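
Google Vision's text_detection returns the full concatenated transcript as its first annotation (the long first line above), followed by the individual words. A sketch, with a placeholder file path and configured application credentials assumed:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("gould_vacuum_demo.jpg", "rb") as f:  # hypothetical local file
        response = client.text_detection(image=vision.Image(content=f.read()))

    annotations = response.text_annotations
    if annotations:
        print("Full text:", annotations[0].description)
        for word in annotations[1:]:
            print(word.description)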