Human Generated Data

Title

Untitled (auction, New Carlisle, Ohio)

Date

July 30, 1938

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.586

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Computer Hardware 100
Electronics 100
Hardware 100
Monitor 100
Screen 100
TV 99.6
Person 98.2
Person 98.1
Furniture 95.9
Table 95.9
Dining Table 94.1
Face 91.8
Head 91.8
Person 88.7
Accessories 86.7
Formal Wear 86.7
Tie 86.7
Computer 67.1
Desk 66.2
Chair 59.8
Architecture 57.2
Building 57.2
Indoors 57.2
Living Room 57.2
Room 57.2
Art 56.5
Painting 56.5
Photography 55.6
Dining Room 55.1

Clarifai
created on 2018-05-11

people 99.1
furniture 98.3
one 97.8
room 97.7
no person 96.3
seat 96.2
adult 95.7
indoors 93.9
two 93.6
chair 93.5
group 90.4
retro 89.3
wear 87.8
man 87.6
music 87.1
vehicle 86.6
table 85.7
art 85
home 84.8
leader 82

Imagga
created on 2023-10-06

telephone 37.6
pay-phone 32.1
electronic equipment 21.3
parking meter 21
box 20.8
equipment 19.3
chair 18.3
mailbox 18.2
timer 17
container 16.4
business 15.2
technology 14.8
grand piano 14.2
timepiece 14.1
measuring instrument 13.9
computer 13.7
old 13.2
phone 12.9
wall 12.8
device 12.8
wood 12.5
piano 12.3
empty 12
black 12
call 12
copy 11.5
vintage 11
city 10.8
retro 10.6
hand 10.6
office 10.4
seat 10.2
space 10.1
keyboard instrument 9.6
stringed instrument 9.5
person 9.5
grunge 9.4
percussion instrument 9.3
street 9.2
wooden 8.8
man 8.7
work 8.6
blank 8.6
outside 8.5
building 8.5
modern 8.4
camera 8.4
people 8.4
outdoors 8.2
dirty 8.1
instrument 8.1
light 8
working 7.9
desk 7.9
urban 7.9
paper 7.8
architecture 7.8
scale 7.7
texture 7.6
snow 7.6
screen 7.6
musical instrument 7.4
style 7.4
laptop 7.4
businesswoman 7.3
looking 7.2
home 7.2
adult 7.1

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

indoor 85.9
seat 32.3

Color Analysis

Face analysis

AWS Rekognition

Age 22-30
Gender Male, 100%
Calm 99.7%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Angry 0.1%
Confused 0%
Happy 0%
Disgusted 0%

AWS Rekognition

Age 28-38
Gender Female, 100%
Calm 96.5%
Surprised 6.3%
Fear 5.9%
Sad 3.4%
Confused 0.1%
Angry 0%
Happy 0%
Disgusted 0%

Microsoft Cognitive Services

Age 35
Gender Male

Microsoft Cognitive Services

Age 28
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.2%
Tie 86.7%
Chair 59.8%

Categories

Captions

Microsoft
created on 2018-05-11

a person sitting in a chair 38.4%