Human Generated Data

Title

Untitled (album compiled by the Campbell family on a missionary trip to Africa)

Date

1946-1950

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Janet and Daniel Tassel, 2007.219.11

Machine Generated Data

Tags

Amazon
created on 2019-11-05

Advertisement 99.3
Human 99.3
Person 99.3
Person 98.5
Person 98.3
Paper 97.1
Brochure 97.1
Flyer 97.1
Person 95.7
Person 95.7
Person 95.6
Person 95.2
Poster 95
Person 92.7
Person 90.2
Person 78.9
Collage 77.1
Person 71.9
Person 64.2
Person 59.9
Performer 58.8
Person 47.7
Person 44.9

Clarifai
created on 2019-11-05

people 99.6
adult 98.9
monochrome 98.1
man 96.2
group 93.7
illustration 93.1
woman 92.7
indoors 91.2
furniture 88.9
wear 88
one 86.8
sit 86.6
no person 86.5
music 85.1
war 81.2
military 80.8
chair 80.3
outdoors 80.3
two 80.1
retro 79.3

Imagga
created on 2019-11-05

stereo 32.3
device 29.2
business 27.3
money 21.3
digital 21.1
technology 20.8
cash 20.1
computer 20
equipment 18.2
finance 17.7
communication 17.6
data 17.3
currency 16.2
close 16
information 15.9
audio system 15.9
electronic 15.9
electronic equipment 15.8
dollar 15.8
banking 15.6
wealth 15.3
circuit board 14.8
hardware 14.4
financial 14.3
bank 13.4
paper 13.3
electrical device 13.1
security 12.9
black 12.6
perfume 12.6
connect 12.4
connection 11.9
object 11.7
electronics 11.4
key 11.4
network 11.1
hand 10.6
blister pack 10.5
metal 10.5
tech 10.4
savings 10.2
chip 10.2
design 10.1
closeup 10.1
electricity 10.1
investment 10.1
market 9.8
toiletry 9.7
dollars 9.7
cable 9.5
bill 9.5
block 9.5
memory 9.5
symbol 9.4
battery 9.3
container 9.2
plug 8.9
silver 8.8
banknote 8.7
treasure 8.7
electrical 8.6
economy 8.3
stack 8.3
note 8.3
packaging 8.2
board 8.1
conceptual 7.9
lighter 7.8
grunge 7.7
old 7.7
box 7.6
storage 7.6
display 7.6
stock 7.5
phone 7.4
packet 7.3
drive 7.1
button 7

Google
created on 2019-11-05

Microsoft
created on 2019-11-05

text 100
person 97.8
clothing 96
man 91.6
book 90.8
poster 90.6
posing 63.2

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 51-69
Gender Male, 54.9%
Fear 45%
Calm 54.9%
Happy 45%
Disgusted 45%
Angry 45%
Confused 45%
Sad 45%
Surprised 45%

AWS Rekognition

Age 48-66
Gender Female, 53%
Angry 46.1%
Disgusted 45.3%
Fear 45.8%
Calm 52%
Happy 45%
Sad 45.4%
Surprised 45.2%
Confused 45.1%

AWS Rekognition

Age 62-78
Gender Male, 54.5%
Fear 45%
Sad 46.2%
Angry 45.1%
Disgusted 45%
Calm 53.6%
Happy 45%
Confused 45.1%
Surprised 45%

AWS Rekognition

Age 12-22
Gender Female, 50.9%
Surprised 45%
Disgusted 45%
Fear 45.2%
Confused 45.2%
Sad 46.3%
Angry 45.1%
Happy 48.6%
Calm 49.6%

AWS Rekognition

Age 3-11
Gender Female, 50.5%
Surprised 45%
Confused 45%
Sad 51.6%
Happy 47.6%
Fear 45%
Angry 45.1%
Calm 45.6%
Disgusted 45%

AWS Rekognition

Age 61-77
Gender Male, 50.5%
Calm 49.6%
Angry 49.5%
Sad 49.5%
Surprised 49.5%
Fear 49.5%
Happy 50.4%
Disgusted 49.5%
Confused 49.5%

AWS Rekognition

Age 13-23
Gender Female, 51.8%
Calm 45.3%
Angry 45.1%
Disgusted 45.3%
Fear 45.2%
Surprised 45.1%
Happy 53.4%
Sad 45.4%
Confused 45.1%

AWS Rekognition

Age 17-29
Gender Female, 50.2%
Happy 49.6%
Disgusted 49.6%
Sad 49.5%
Calm 49.5%
Angry 50%
Fear 49.6%
Surprised 49.5%
Confused 49.5%

AWS Rekognition

Age 23-37
Gender Female, 50.1%
Calm 50.5%
Sad 49.5%
Surprised 49.5%
Angry 49.5%
Fear 49.5%
Disgusted 49.5%
Confused 49.5%
Happy 49.5%

AWS Rekognition

Age 28-44
Gender Female, 50.1%
Surprised 49.5%
Fear 50.4%
Disgusted 49.5%
Sad 49.6%
Happy 49.5%
Angry 49.5%
Calm 49.5%
Confused 49.5%

AWS Rekognition

Age 13-25
Gender Female, 50.4%
Fear 49.6%
Angry 49.5%
Disgusted 49.5%
Surprised 49.7%
Happy 49.9%
Confused 49.5%
Calm 49.7%
Sad 49.6%

AWS Rekognition

Age 8-18
Gender Female, 50.3%
Calm 49.5%
Surprised 49.5%
Happy 49.5%
Sad 49.5%
Angry 50.4%
Fear 49.5%
Disgusted 49.5%
Confused 49.5%

AWS Rekognition

Age 29-45
Gender Female, 50.4%
Happy 49.8%
Disgusted 49.5%
Surprised 49.6%
Sad 49.7%
Calm 49.5%
Confused 49.5%
Fear 49.7%
Angry 49.6%

AWS Rekognition

Age 50-68
Gender Female, 50.2%
Sad 49.5%
Angry 49.5%
Fear 49.5%
Calm 49.7%
Happy 49.6%
Disgusted 49.5%
Confused 49.5%
Surprised 50.1%

Feature analysis

Amazon

Person 99.3%
Poster 95%

Categories

Text analysis

Amazon

Marian
Arpril
MARINE
Shinn
Sandra,
Sandra, Dot Marian
1946
BOUND Arpril 25, 1946
MNeil,
Dot
25,
Mrs.
BOUND
Neil
Pa
ES.Campbell,
TIGER"
Marian,
MONeil. Dot Mr.Roberts Shinn
Mrs. Mrs. Rev. Mrs. ES.Campbell,
Pop Me Neil Marian, Mrs. MNeil, Dot. Pa
Dot.
Pop
AFPICA BOU
"S.S S MARINE TIGER"
MONeil.
Mr.Roberts
Rev.
"S.S
S
Me
y

Google

April 25, 1946 AFRICA BOUND Mrs. MeNeil Dot, Mr.Roberts Mrs.Shinn Rev. Mrs.F.S.Campbell, Sandra, Dot+ Marian Dot S.S MARINE TIGER Pop Mc Neil Marian, Mrs. McNei l, Dot, Pa
April
25,
1946
AFRICA
BOUND
Mrs.
MeNeil
Dot,
Mr.Roberts
Mrs.Shinn
Rev.
Sandra,
Dot+
Marian
Dot
S.S
MARINE
TIGER
Mc
Mrs.F.S.Campbell,
Pop
Neil
Marian,
McNei
l,
Pa