Human Generated Data

Title

Grumman

Date

1978

People

Artist: Elaine Mayes, American, born 1938

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.1683

Copyright

© Elaine Mayes

Machine Generated Data

Tags (confidence scores, %)

Clarifai
created on 2023-10-25

people 99.2
one 96.6
room 94.8
woman 94.1
computer 92.1
family 87.6
science 87.5
monochrome 86.9
education 86.7
technology 86.6
school 85.1
girl 84.5
portrait 84.5
child 84.2
adult 81.3
desk 80.8
two 80.7
indoors 80.2
industry 79.6
furniture 76.7

Imagga
created on 2022-01-09

business 38.9
work 33.7
computer 32.6
working 31.8
office 27.2
device 26.2
laptop 25.9
technology 25.2
paper 22.8
desk 22.3
finance 22
hand 21.3
money 21.3
equipment 19.8
keyboard 19.6
pen 19.3
people 19
professional 17.3
financial 16.9
man 16.9
corporate 16.3
person 16.3
adult 15.7
close 15.4
success 15.3
male 14.9
calculator 14.9
table 14.2
job 14.2
businessman 14.1
data 13.7
workplace 13.3
notebook 13.1
hands 13
businesswoman 12.7
wealth 12.6
busy 12.5
document 12.1
human 12
worker 11.6
closeup 11.5
investment 11
indoors 10.5
plan 10.4
write 10.3
executive 10.1
occupation 10.1
communication 10.1
modern 9.8
market 9.8
information 9.7
digital 9.7
hardware 9.6
women 9.5
writing 9.4
newspaper 9.3
teamwork 9.3
finger 9.2
banking 9.2
note 9.2
team 9
bank 9
electronic equipment 8.8
button 8.8
home 8.8
report 8.7
dollars 8.7
smiling 8.7
education 8.7
graph 8.6
engineer 8.6
men 8.6
electronics 8.5
businesspeople 8.5
stock 8.4
portrait 8.4
electronic 8.4
manager 8.4
cash 8.2
key 8.1
school 8.1
object 8.1
interior 8
color 7.8
sitting 7.7
desktop 7.7
fingers 7.6
company 7.4
phone 7.4
inside 7.4
indoor 7.3
appliance 7.3
smile 7.1
idea 7.1

Microsoft
created on 2022-01-09

text 99.8
drawing 93
person 92.8
indoor 89
clothing 86.9
black and white 82.4
woman 55.6
desk 10.6

Face analysis

AWS Rekognition

Age 24-34
Gender Female, 99.4%
Sad 66.6%
Calm 32.5%
Fear 0.3%
Angry 0.1%
Confused 0.1%
Disgusted 0.1%
Surprised 0.1%
Happy 0.1%
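
These face readings come from Amazon Rekognition's DetectFaces API. As a rough illustration only, a minimal boto3 sketch of the kind of call that yields the age range, gender, and emotion confidences listed above (the image filename is hypothetical, not a path from this record):

```python
import boto3

client = boto3.client("rekognition")

# Hypothetical local copy of the photograph.
with open("grumman.jpg", "rb") as f:
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    gender = face["Gender"]
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions come back as TYPE/confidence pairs; sort by confidence.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```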

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99%
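
Feature confidences like "Person 99%" are the output of Rekognition's DetectLabels API. A minimal sketch under the same assumptions (hypothetical filename; the confidence threshold is chosen for illustration):

```python
import boto3

client = boto3.client("rekognition")

# Hypothetical local copy of the photograph.
with open("grumman.jpg", "rb") as f:
    response = client.detect_labels(Image={"Bytes": f.read()}, MinConfidence=75)

# Each label carries a name and a confidence percentage.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.0f}%")
```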

Categories

Imagga

paintings art 99.6%

Text analysis

Amazon

8