Human Generated Data

Title

Untitled (worker and machinery)

Date

1995

People

Artist: Stephen Hass, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of the artist, 2.2002.1686

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Human 97.9
Person 97.9
Machine 89.6
Electronics 79.4
Camera 79.4

Imagga
created on 2022-01-09

container 47.7
milk can 35
can 34.8
device 32.2
stove 25.6
machine 21.3
pot 20
vessel 19.3
kitchen 18.8
metal 17.7
equipment 17.6
coffeepot 17.1
glass 17.1
cocktail shaker 16.4
espresso maker 15.3
fire extinguisher 15.3
fire 15
bottle 14.5
technology 13.3
milking machine 13.3
shaker 13.1
food 12.7
old 12.5
coffee maker 12.3
drink 11.7
steel 11.5
tool 11.3
cooking 11.3
jug 11.3
antique 10.9
hot 10.9
kitchen appliance 10.4
cooking utensil 10.3
industry 10.2
black 9.6
liquid 9.6
handle 9.5
object 9.5
home appliance 9.4
heat 9.2
cook 9.1
digital 8.9
utensil 8.7
work 8.6
oil 8.4
safety 8.3
industrial 8.2
reflection 8.1
appliance 8.1
water 8
home 8
interior 8
shiny 7.9
3d 7.7
emergency 7.7
gas 7.7
effects 7.6
three dimensional 7.5
graphics 7.3
beverage 7.3
computer 7.2
bell 7.1
life 7

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

indoor 99.2
wall 96.4
person 92.6
kitchen 89.9
black and white 89.9
text 61.4
preparing 53.9
cooking 30.8
kitchen appliance 16.1

Face analysis

Google

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 97.9%
Camera 79.4%

Captions

Microsoft

a person standing in front of a stove 54%
a person cooking in a kitchen 53.9%
a person standing in front of a stove 42.8%