Human Generated Data

Title

Untitled (contact sheet)

Date

20th century

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4589

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 97.1
Human 97.1
Person 92.5
Person 89.1
Helmet 82.6
Clothing 82.6
Apparel 82.6
Home Decor 69
Wheel 60
Machine 60
Window 58.1
Meal 57
Food 57
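
The name/confidence pairs above have the shape of Amazon Rekognition DetectLabels output. A minimal boto3 sketch of how such tags could be regenerated; the file name, region, and confidence floor are assumptions, not part of this record:

```python
# Hypothetical sketch: label/confidence tags like those above via Amazon
# Rekognition's DetectLabels API. File name and region are placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("contact_sheet.jpg", "rb") as f:  # placeholder image file
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=20,
        MinConfidence=55,  # the weakest tags above (Meal/Food) score 57
    )

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```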

Clarifai
created on 2023-10-25

negative 98.9
movie 96.5
art 93.2
people 92.5
old 91.8
technology 91.3
vintage 91.3
photograph 90.7
slide 90.7
analogue 89.9
analog 89.5
screen 89.3
industry 88.5
steel 88.5
desktop 88.2
filmstrip 88.1
retro 87.7
rust 86.7
abstract 85.7
noisy 82.4
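
The concept scores above resemble output from Clarifai's general image-recognition model. A hedged sketch against the v2 REST API; the API key, model path, and image URL are placeholders, and the exact request shape may differ by API version:

```python
# Hypothetical sketch: concept/confidence tags like those above via Clarifai's
# v2 REST API. Key, model path, and image URL are assumptions.
import requests

response = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": "Key your_clarifai_api_key"},  # placeholder key
    json={"inputs": [{"data": {"image": {
        "url": "https://example.org/contact_sheet.jpg"  # placeholder URL
    }}}]},
)
response.raise_for_status()

# Clarifai reports confidence as a 0-1 value; scale to match the list above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```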

Imagga
created on 2022-01-08

equipment 73.3
electronic equipment 57.2
amplifier 48.4
technology 31.1
sequencer 28
apparatus 23.9
computer 23.3
digital 19.4
device 18.8
industry 16.2
network 15.7
object 13.9
cassette 13.5
sound 13.1
film 12.7
music 12.6
cassette tape 12.5
cable 12.4
electronics 12.3
electronic 12.1
connection 11.9
data 11.9
board 11.7
business 11.5
black 11.4
media 11.4
old 11.1
texture 11.1
switch 10.8
magnetic tape 10.7
negative 10.7
retro 10.6
silver 10.6
three dimensional 10.3
communication 10.1
graphics 10
plug 9.7
video 9.7
router 9.6
audio 9.6
effects 9.5
memory device 9.4
grunge 9.4
3d 9.3
studio 9.1
vintage 9.1
information 8.8
server 8.7
panel 8.7
tape 8.6
industrial 8.2
office 8
design 7.9
cables 7.8
cinema 7.8
modern 7.7
container 7.5
wire 7.5
close 7.4
entertainment 7.4
light 7.3
plastic 7.3
metal 7.2
drive 7.1
button 7
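
Imagga's tagging endpoint returns tag/confidence pairs like those above. A sketch assuming v2 basic-auth credentials and a hosted image URL, all of which are placeholders:

```python
# Hypothetical sketch: fetching tags like those above from Imagga's v2
# tagging endpoint. Credentials and image URL are placeholders.
import requests

IMAGGA_KEY = "your_api_key"        # assumption: from your Imagga account
IMAGGA_SECRET = "your_api_secret"  # assumption

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/contact_sheet.jpg"},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
response.raise_for_status()

for entry in response.json()["result"]["tags"]:
    print(f"{entry['tag']['en']} {entry['confidence']:.1f}")
```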

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 98.5
indoor 89.4
appliance 75.1

Face analysis

AWS Rekognition

Age 45-53
Gender Male, 99.6%
Calm 95.9%
Sad 2.8%
Angry 0.4%
Disgusted 0.3%
Surprised 0.3%
Confused 0.2%
Fear 0.1%
Happy 0.1%

AWS Rekognition

Age 48-54
Gender Male, 100%
Calm 83.8%
Happy 14.4%
Confused 0.8%
Angry 0.4%
Surprised 0.4%
Disgusted 0.2%
Sad 0.1%
Fear 0%

AWS Rekognition

Age 18-24
Gender Male, 55.5%
Calm 79.5%
Sad 12.9%
Happy 5.1%
Surprised 0.7%
Fear 0.5%
Angry 0.5%
Disgusted 0.4%
Confused 0.2%
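
Each block above matches a FaceDetail from Amazon Rekognition DetectFaces when full attributes are requested. A minimal boto3 sketch; the file name and region are assumptions:

```python
# Hypothetical sketch: age-range, gender, and emotion scores like those above
# via Amazon Rekognition's DetectFaces API with full attributes.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("contact_sheet.jpg", "rb") as f:  # placeholder image file
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # needed for AgeRange, Gender, and Emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, "
          f"{face['Gender']['Confidence']:.1f}%")
    # Rekognition emits emotion types in uppercase, e.g. "CALM"
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```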

Microsoft Cognitive Services

Age 57
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
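
The likelihood ratings above correspond to the fields of a FaceAnnotation from the Google Cloud Vision face-detection API. A sketch assuming credentials are already configured and using a local placeholder file:

```python
# Hypothetical sketch: per-face likelihood ratings like those above via the
# Google Cloud Vision API. Assumes application credentials are set up.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("contact_sheet.jpg", "rb") as f:  # placeholder image file
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

likelihood = vision.Likelihood  # VERY_UNLIKELY ... VERY_LIKELY enum
for face in response.face_annotations:
    print("Surprise", likelihood(face.surprise_likelihood).name)
    print("Anger", likelihood(face.anger_likelihood).name)
    print("Sorrow", likelihood(face.sorrow_likelihood).name)
    print("Joy", likelihood(face.joy_likelihood).name)
    print("Headwear", likelihood(face.headwear_likelihood).name)
    print("Blurred", likelihood(face.blurred_likelihood).name)
```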

Feature analysis

Amazon

Person 97.1%
Helmet 82.6%
Wheel 60%

Categories

Imagga

cars vehicles 96.1%
interior objects 2.1%

Captions

Microsoft
created on 2022-01-08

a close up of a machine 27.6%
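
A caption with a confidence score like the one above is the shape returned by Azure Computer Vision's describe operation. A sketch; the endpoint, subscription key, and image URL are placeholders:

```python
# Hypothetical sketch: generating a caption/confidence pair like the one above
# with Azure Computer Vision. Endpoint, key, and image URL are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder
    CognitiveServicesCredentials("your_subscription_key"),   # placeholder
)

description = client.describe_image("https://example.org/contact_sheet.jpg")
# Azure reports confidence as a 0-1 value; scale to match the figure above.
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```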

Text analysis

Amazon

18
ATLANTIC

Google

18 13MUD ATLANTIC SA 18
18
13MUD
ATLANTIC
SA
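
Line-level OCR detections like those above are what Amazon Rekognition's DetectText API returns. A minimal sketch, reusing the placeholder file name from the earlier examples:

```python
# Hypothetical sketch: OCR results like "18" and "ATLANTIC" above via Amazon
# Rekognition's DetectText API. File name and region are placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("contact_sheet.jpg", "rb") as f:  # placeholder image file
    response = rekognition.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # WORD entries repeat the same text
        print(detection["DetectedText"])
```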