Human Generated Data

Title

Untitled (girls in shorts seated in circle formation)

Date

1944

People

Artist: John Deusing, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1593

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Human 98.6
Person 98.6
Person 98.5
Person 98.5
Person 98.4
Person 96.7
Person 96.6
Person 95.6
Person 91.9
Drawing 89.4
Art 89.4
Person 83.3
People 81.4
Person 81.1
Person 77.7
Person 77.3
Person 74.8
Sketch 72.3
Crowd 60.7
Person 60.3
Parade 59.4
Person 57.1
Building 55.5
Arena 55.5
Amphitheatre 55.5
Architecture 55.5
Amphitheater 55.5

Imagga
created on 2021-12-14

gear 34.5
design 26.4
graphic 23.3
art 20.9
pattern 20.5
structure 20.5
fountain 20.4
texture 20.1
digital 19.4
shape 16.3
light 15.4
element 14.9
tile 14.9
motion 14.6
wallpaper 14.5
grunge 14.5
style 14.1
artistic 13.9
decorative 13.4
space 13.2
curve 13.1
color 12.8
tray 12.6
futuristic 12.6
detail 11.3
fractal 11
decoration 11
star 10.8
lines 10.8
backdrop 10.7
retro 10.6
negative 10.6
modern 10.5
computer 10.4
receptacle 10.2
symbol 10.1
3d 10.1
global 10
frame 10
transparent 9.9
crown 9.8
drawing 9.7
business 9.7
wave 9.5
ornament 9.5
water 9.3
travel 9.2
effect 9.1
vintage 9.1
technology 8.9
paper 8.8
film 8.5
web 8.4
science 8
container 8
cool 8
world 8
shiny 7.9
card 7.8
black 7.8
line 7.7
old 7.7
spiral 7.6
flowing 7.5
generated 7.4
glowing 7.4
fantasy 7.2
holiday 7.2
splash 7.1
creative 7.1
conceptual 7.1
sky 7

Google
created on 2021-12-14

Art 73.9
Font 68.5
Arch 63.1
Plant 59.6
Circle 58.1
Monochrome 57.8
Sport venue 57.7
History 56.8
Auto part 56.5
Landscape 55.1
Eyelash 55.1
Monochrome photography 55
Rectangle 52.6
Automotive tire 51.6

Microsoft
created on 2021-12-14

text 99.5
sketch 81.5
drawing 79.3
black and white 71.6
old 66.5
person 61.9

Face analysis

AWS Rekognition

Age 50-68
Gender Male, 89.3%
Calm 78.1%
Happy 9.9%
Disgusted 4%
Sad 2.2%
Confused 2.1%
Angry 2%
Surprised 1.5%
Fear 0.3%

AWS Rekognition

Age 31-47
Gender Male, 95%
Calm 86.1%
Happy 8.8%
Surprised 2%
Sad 1.4%
Confused 1.1%
Disgusted 0.2%
Angry 0.2%
Fear 0.1%

AWS Rekognition

Age 36-52
Gender Male, 75.3%
Calm 95.8%
Happy 1.2%
Sad 1.1%
Surprised 0.6%
Confused 0.4%
Disgusted 0.3%
Fear 0.3%
Angry 0.3%

AWS Rekognition

Age 49-67
Gender Female, 64%
Sad 56.1%
Calm 32.2%
Happy 8.1%
Confused 1.2%
Angry 0.9%
Fear 0.9%
Surprised 0.4%
Disgusted 0.2%

AWS Rekognition

Age 24-38
Gender Male, 78.7%
Calm 86.3%
Sad 6.4%
Surprised 2.5%
Angry 1.4%
Happy 1.3%
Fear 1.2%
Disgusted 0.5%
Confused 0.3%

AWS Rekognition

Age 38-56
Gender Female, 54.7%
Calm 89.4%
Surprised 4.5%
Happy 2.2%
Sad 1.5%
Disgusted 1%
Confused 0.5%
Angry 0.5%
Fear 0.2%

AWS Rekognition

Age 14-26
Gender Female, 64.2%
Happy 93.7%
Calm 4.8%
Confused 0.5%
Surprised 0.4%
Sad 0.3%
Angry 0.1%
Fear 0.1%
Disgusted 0.1%

AWS Rekognition

Age 23-35
Gender Female, 73.9%
Calm 93.6%
Surprised 2.3%
Disgusted 1%
Confused 0.9%
Happy 0.7%
Sad 0.6%
Fear 0.5%
Angry 0.4%

AWS Rekognition

Age 28-44
Gender Female, 53.8%
Calm 68%
Happy 22.6%
Sad 8%
Confused 0.7%
Angry 0.3%
Fear 0.2%
Disgusted 0.2%
Surprised 0.1%

AWS Rekognition

Age 21-33
Gender Female, 75%
Calm 43.3%
Sad 39.1%
Happy 11.8%
Angry 1.5%
Disgusted 1.2%
Confused 1.1%
Surprised 1%
Fear 1%

AWS Rekognition

Age 32-48
Gender Female, 78.2%
Calm 91.7%
Sad 3.9%
Confused 1.7%
Surprised 1.2%
Fear 0.7%
Happy 0.6%
Angry 0.2%
Disgusted 0.1%

AWS Rekognition

Age 5-15
Gender Female, 64.5%
Calm 53.2%
Happy 28.8%
Sad 11.1%
Confused 2.5%
Disgusted 2%
Angry 1.3%
Surprised 0.6%
Fear 0.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.6%

Captions

Microsoft

a vintage photo of a person 47%
an old photo of a person 46.4%
a vintage photo of a person 36.7%

Text analysis

Amazon

KODAKAAITW
plasti

Google

-YT3RA°2
-YT3RA°2