Human Generated Data

Title

Co. I. 2nd Inf. N. G. I., Camp Cosgrove, Wash.

Date

c. 1910

People

Artist: E. L. Meyer, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.1872

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 99.4
Human 99.4
Person 99.1
Person 98.6
Person 98.3
Person 97.8
Painting 92.1
Art 92.1
Person 90.9
Transportation 83.4
Vehicle 81.2
Person 76.5
Person 67.7
People 59.7
Military 57.7
Military Uniform 57.2
Horse Cart 56.5
Wagon 56.5
Person 48.9
Person 44.3
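
The label/confidence pairs above are consistent with output from Amazon Rekognition's DetectLabels operation. A minimal sketch, assuming boto3 with configured AWS credentials and a local copy of the photograph (the file name below is a placeholder, not part of the record), of how comparable tags could be produced:

import boto3

# Read a local copy of the image; "camp_cosgrove.jpg" is a hypothetical path.
with open("camp_cosgrove.jpg", "rb") as f:
    image_bytes = f.read()

client = boto3.client("rekognition")

# DetectLabels returns object/scene labels with confidence scores,
# comparable to the "Person 99.4", "Painting 92.1", etc. entries above.
response = client.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=40)
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")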

Clarifai
created on 2023-10-26

people 99.5
art 99.2
cavalry 98.7
painting 98.1
picture frame 97.8
group 96.2
tree 96.2
vintage 95.9
family 95.5
dog 94.9
museum 94.2
landscape 94.1
mammal 94
album 93.8
wood 93
window 92.4
child 92.4
furniture 92
portrait 91.6
print 91.4

Imagga
created on 2022-01-22

television 86
telecommunication system 55.2
old 52.9
vintage 51.3
grunge 51.1
antique 50.2
frame 48.4
wall 47.9
retro 43.4
texture 41.7
border 36.2
damaged 31.5
rusty 31.4
material 31.2
ancient 30.3
empty 30
aged 29.9
design 29.8
dirty 28
blank 27.4
art 25.4
structure 25.2
screen 25.2
space 24
pattern 23.2
windowsill 22.8
paper 22.7
monitor 22.6
backdrop 21.4
grungy 20.9
wallpaper 20.7
old fashioned 20
rough 19.1
sill 18.2
graphic 18.2
decoration 18.1
interior 17.7
textured 17.5
crumpled 17.5
decay 17.4
rust 17.3
obsolete 17.2
grime 16.6
mottled 16.6
stains 16.5
faded 16.5
window screen 16.5
weathered 16.1
wood 15.8
frames 15.6
fracture 15.6
spot 15.3
grain 14.8
room 14.6
crack 14.5
messy 14.5
structural member 14.5
gray 14.4
black 13.8
framework 13.8
tracery 13.6
ragged 13.6
edge 13.5
parchment 13.4
surface 13.2
window 13
crease 12.7
stone 12.7
broadcasting 12.6
succulent 12.6
your 12.6
brown 12.5
display 12.2
historic 11.9
broad 11.8
dark 11.7
building 11.6
dirt 11.5
text 11.3
digital 11.3
paint 10.9
highly 10.8
photographic 10.8
album 10.7
scrapbook 10.7
detailed 10.6
stain 10.6
film 10.6
aging 10.5
concrete 10.5
protective covering 10.4
support 10.1
nobody 10.1
scratches 9.8
gallery 9.8
exhibition 9.8
abandoned 9.8
computer 9.7
backgrounds 9.7
home 9.6
architecture 9.4
equipment 9.3
cement 9.1
ornate 9.1
color 8.9
rotting 8.9
detail 8.8
blackboard 8.8
scratch 8.8
stained 8.7
entertainment 8.3
style 8.2
painting 8.1
telecommunication 8
wooden 7.9
overlay 7.9
noisy 7.9
designed 7.9
layered 7.9
mess 7.9
noise 7.8
covering 7.8
layer 7.7
collage 7.7
construction 7.7
mask 7.7
card 7.7
illuminated 7.6
textures 7.6
element 7.4
page 7.4
gold 7.4
decor 7.1
indoors 7

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

room 98.8
gallery 98.1
scene 94.4
indoor 92.5
person 92.4
window 90.3
text 81.1
old 74.9
picture frame 66.3
mammal 50.1
painting 19.1

Face analysis

AWS Rekognition

Age 28-38
Gender Male, 91.9%
Happy 84.1%
Calm 11.6%
Surprised 2.8%
Sad 0.4%
Confused 0.3%
Angry 0.3%
Disgusted 0.3%
Fear 0.2%

AWS Rekognition

Age 23-33
Gender Male, 97.6%
Calm 59.1%
Sad 34.6%
Confused 1.8%
Happy 1.4%
Surprised 1.1%
Angry 0.8%
Fear 0.6%
Disgusted 0.5%

AWS Rekognition

Age 37-45
Gender Male, 100%
Calm 99.6%
Confused 0.1%
Sad 0.1%
Fear 0.1%
Angry 0%
Surprised 0%
Disgusted 0%
Happy 0%

AWS Rekognition

Age 20-28
Gender Female, 71.5%
Calm 47.8%
Sad 34.9%
Surprised 7.1%
Disgusted 3.5%
Fear 2.8%
Angry 1.8%
Happy 1.3%
Confused 0.9%

AWS Rekognition

Age 26-36
Gender Male, 96.4%
Calm 84.7%
Sad 8.9%
Angry 2.5%
Confused 1.3%
Surprised 1%
Happy 0.6%
Disgusted 0.5%
Fear 0.4%

AWS Rekognition

Age 22-30
Gender Male, 99.7%
Confused 78.1%
Calm 7.9%
Surprised 5.6%
Angry 3.9%
Sad 3.4%
Disgusted 0.5%
Fear 0.4%
Happy 0.3%

AWS Rekognition

Age 31-41
Gender Male, 100%
Sad 63%
Calm 22.7%
Surprised 5.8%
Angry 3.3%
Fear 2.8%
Happy 1.1%
Confused 0.7%
Disgusted 0.6%

AWS Rekognition

Age 20-28
Gender Male, 100%
Calm 99.9%
Sad 0%
Surprised 0%
Angry 0%
Confused 0%
Disgusted 0%
Happy 0%
Fear 0%

AWS Rekognition

Age 45-53
Gender Male, 99.8%
Calm 88.5%
Surprised 4.8%
Happy 3.9%
Disgusted 1.2%
Angry 0.7%
Sad 0.6%
Confused 0.2%
Fear 0.2%

AWS Rekognition

Age 35-43
Gender Male, 97.5%
Happy 64.1%
Calm 18.3%
Sad 11.7%
Surprised 1.7%
Angry 1.2%
Disgusted 1.1%
Fear 0.9%
Confused 0.8%

AWS Rekognition

Age 20-28
Gender Male, 100%
Calm 99.4%
Sad 0.2%
Happy 0.2%
Angry 0.1%
Confused 0.1%
Surprised 0.1%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 48-56
Gender Male, 100%
Happy 52%
Sad 16.6%
Angry 13.9%
Calm 6%
Surprised 4.7%
Fear 3.7%
Disgusted 2.3%
Confused 0.8%

AWS Rekognition

Age 23-33
Gender Male, 63%
Happy 44.4%
Sad 35.7%
Calm 8.7%
Surprised 3.5%
Fear 2.2%
Angry 2.1%
Disgusted 2.1%
Confused 1.2%

AWS Rekognition

Age 28-38
Gender Male, 59.9%
Calm 99.5%
Angry 0.3%
Sad 0.1%
Surprised 0.1%
Disgusted 0%
Confused 0%
Happy 0%
Fear 0%

AWS Rekognition

Age 22-30
Gender Male, 96.2%
Calm 97.1%
Disgusted 0.8%
Happy 0.7%
Surprised 0.5%
Sad 0.4%
Angry 0.3%
Confused 0.2%
Fear 0.1%

AWS Rekognition

Age 20-28
Gender Male, 99.8%
Calm 95%
Angry 1.7%
Surprised 1.3%
Sad 0.7%
Happy 0.5%
Fear 0.4%
Disgusted 0.3%
Confused 0.2%

AWS Rekognition

Age 16-22
Gender Male, 100%
Calm 98.4%
Angry 0.5%
Surprised 0.4%
Sad 0.3%
Disgusted 0.2%
Happy 0.1%
Confused 0.1%
Fear 0.1%

AWS Rekognition

Age 30-40
Gender Male, 98.5%
Angry 85.7%
Sad 6.5%
Calm 2.9%
Surprised 1.8%
Happy 0.8%
Disgusted 0.8%
Fear 0.8%
Confused 0.7%

AWS Rekognition

Age 20-28
Gender Male, 100%
Calm 88.8%
Angry 4.4%
Confused 3.6%
Sad 1.9%
Disgusted 0.5%
Surprised 0.4%
Fear 0.2%
Happy 0.2%

AWS Rekognition

Age 24-34
Gender Male, 100%
Happy 50%
Calm 31.3%
Disgusted 5.9%
Angry 4.4%
Surprised 3.7%
Sad 2.7%
Confused 1.2%
Fear 0.6%

AWS Rekognition

Age 24-34
Gender Male, 83.1%
Calm 99.7%
Angry 0.1%
Surprised 0.1%
Sad 0.1%
Happy 0%
Disgusted 0%
Fear 0%
Confused 0%
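
Each AWS Rekognition block above (age range, gender, and ranked emotion scores) matches the face attributes returned by Rekognition's DetectFaces operation when all attributes are requested. A minimal sketch under the same assumptions as the label example (boto3 credentials configured, hypothetical local file name):

import boto3

with open("camp_cosgrove.jpg", "rb") as f:  # placeholder file name
    image_bytes = f.read()

client = boto3.client("rekognition")

# Attributes=["ALL"] is required to get age, gender, and emotion estimates.
response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions come back unsorted; sort to mirror the listings above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
    print()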

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Unlikely

Feature analysis

Amazon

Person 99.4%
Painting 92.1%

Categories

Imagga

paintings art 90.7%
interior objects 7.2%

Text analysis

Amazon

INENGI
Photo
Co.I.2 vo INENGI
Co.I.2
Wasn
SCATICA
ваметей
vo
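
The fragments listed under Text analysis look like raw optical character recognition output over the photograph's handwritten caption; with Rekognition this corresponds to the DetectText operation. A minimal sketch under the same assumptions as the examples above:

import boto3

with open("camp_cosgrove.jpg", "rb") as f:  # placeholder file name
    image_bytes = f.read()

client = boto3.client("rekognition")

# DetectText returns both LINE and WORD detections; printing only lines
# yields fragments comparable to "Co.I.2", "Photo", etc. above.
response = client.detect_text(Image={"Bytes": image_bytes})
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(f"{detection['DetectedText']} {detection['Confidence']:.1f}")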